Oct 03 08:40:12 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 03 08:40:12 crc restorecon[4664]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 03 08:40:12 crc restorecon[4664]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc 
restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc 
restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 
08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 
crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:12 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 
08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:40:13 crc 
restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc 
restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc 
restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 
crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc 
restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc 
restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc 
restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:40:13 crc restorecon[4664]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 03 08:40:13 crc restorecon[4664]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 03 08:40:14 crc kubenswrapper[4790]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 08:40:14 crc kubenswrapper[4790]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 03 08:40:14 crc kubenswrapper[4790]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 08:40:14 crc kubenswrapper[4790]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 03 08:40:14 crc kubenswrapper[4790]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 03 08:40:14 crc kubenswrapper[4790]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.080047 4790 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088461 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088489 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088495 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088500 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088504 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088508 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088513 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088517 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088520 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088525 4790 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088529 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088535 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088541 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088547 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088553 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088560 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088565 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088571 4790 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088594 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088599 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088604 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088610 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088616 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088621 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088626 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088630 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088634 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088638 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088642 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088653 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088738 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088743 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088748 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088752 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088755 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088761 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088765 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088770 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088774 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088778 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088782 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088786 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088790 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088794 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088797 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088801 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088804 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088830 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088834 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088839 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088843 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088848 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088853 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088858 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088862 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088865 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088869 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088872 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088875 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088879 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088882 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088886 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088889 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088893 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088896 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088899 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088905 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088909 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088913 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088916 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.088919 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089723 4790 flags.go:64] FLAG: --address="0.0.0.0"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089745 4790 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089758 4790 flags.go:64] FLAG: --anonymous-auth="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089765 4790 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089773 4790 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089778 4790 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089786 4790 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089800 4790 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089899 4790 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089906 4790 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089912 4790 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089917 4790 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089923 4790 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089928 4790 flags.go:64] FLAG: --cgroup-root=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089934 4790 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089939 4790 flags.go:64] FLAG: --client-ca-file=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089944 4790 flags.go:64] FLAG: --cloud-config=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089948 4790 flags.go:64] FLAG: --cloud-provider=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089953 4790 flags.go:64] FLAG: --cluster-dns="[]"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089961 4790 flags.go:64] FLAG: --cluster-domain=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089966 4790 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089971 4790 flags.go:64] FLAG: --config-dir=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089976 4790 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089982 4790 flags.go:64] FLAG: --container-log-max-files="5"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089990 4790 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.089996 4790 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090000 4790 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090006 4790 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090011 4790 flags.go:64] FLAG: --contention-profiling="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090015 4790 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090037 4790 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090044 4790 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090049 4790 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090056 4790 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090062 4790 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090067 4790 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090072 4790 flags.go:64] FLAG: --enable-load-reader="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090079 4790 flags.go:64] FLAG: --enable-server="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090084 4790 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090091 4790 flags.go:64] FLAG: --event-burst="100"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090096 4790 flags.go:64] FLAG: --event-qps="50"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090101 4790 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090106 4790 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090112 4790 flags.go:64] FLAG: --eviction-hard=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090120 4790 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090125 4790 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090131 4790 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090141 4790 flags.go:64] FLAG: --eviction-soft=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090146 4790 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090152 4790 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090158 4790 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090162 4790 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090167 4790 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090171 4790 flags.go:64] FLAG: --fail-swap-on="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090176 4790 flags.go:64] FLAG: --feature-gates=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090182 4790 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090190 4790 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090195 4790 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090200 4790 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090212 4790 flags.go:64] FLAG: --healthz-port="10248"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090217 4790 flags.go:64] FLAG: --help="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090221 4790 flags.go:64] FLAG: --hostname-override=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090226 4790 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090231 4790 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090236 4790 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090241 4790 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090246 4790 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090258 4790 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090265 4790 flags.go:64] FLAG: --image-service-endpoint=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090270 4790 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090274 4790 flags.go:64] FLAG: --kube-api-burst="100"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090280 4790 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090285 4790 flags.go:64] FLAG: --kube-api-qps="50"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090290 4790 flags.go:64] FLAG: --kube-reserved=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090295 4790 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090300 4790 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090306 4790 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090313 4790 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090318 4790 flags.go:64] FLAG: --lock-file=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090324 4790 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090328 4790 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090335 4790 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090343 4790 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090348 4790 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090352 4790 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090357 4790 flags.go:64] FLAG: --logging-format="text"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090362 4790 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090367 4790 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090372 4790 flags.go:64] FLAG: --manifest-url=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090376 4790 flags.go:64] FLAG: --manifest-url-header=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090384 4790 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090391 4790 flags.go:64] FLAG: --max-open-files="1000000"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090398 4790 flags.go:64] FLAG: --max-pods="110"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090404 4790 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090411 4790 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090415 4790 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090426 4790 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090431 4790 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090435 4790 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090440 4790 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090454 4790 flags.go:64] FLAG: --node-status-max-images="50"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090459 4790 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090464 4790 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090469 4790 flags.go:64] FLAG: --pod-cidr=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090473 4790 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090484 4790 flags.go:64] FLAG: --pod-manifest-path=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090488 4790 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090493 4790 flags.go:64] FLAG: --pods-per-core="0"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090498 4790 flags.go:64] FLAG: --port="10250"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090503 4790 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090508 4790 flags.go:64] FLAG: --provider-id=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090514 4790 flags.go:64] FLAG: --qos-reserved=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090519 4790 flags.go:64] FLAG: --read-only-port="10255"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090525 4790 flags.go:64] FLAG: --register-node="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090529 4790 flags.go:64] FLAG: --register-schedulable="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090534 4790 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090554 4790 flags.go:64] FLAG: --registry-burst="10"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090559 4790 flags.go:64] FLAG: --registry-qps="5"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090564 4790 flags.go:64] FLAG: --reserved-cpus=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090569 4790 flags.go:64] FLAG: --reserved-memory=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090595 4790 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090601 4790 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090606 4790 flags.go:64] FLAG: --rotate-certificates="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090619 4790 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090624 4790 flags.go:64] FLAG: --runonce="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090629 4790 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090634 4790 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090639 4790 flags.go:64] FLAG: --seccomp-default="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090644 4790 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090649 4790 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090654 4790 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090660 4790 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090665 4790 flags.go:64] FLAG: --storage-driver-password="root"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090670 4790 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090675 4790 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090679 4790 flags.go:64] FLAG: --storage-driver-user="root"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090684 4790 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090689 4790 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090694 4790 flags.go:64] FLAG: --system-cgroups=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090699 4790 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090706 4790 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090711 4790 flags.go:64] FLAG: --tls-cert-file=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090716 4790 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090724 4790 flags.go:64] FLAG: --tls-min-version=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090730 4790 flags.go:64] FLAG: --tls-private-key-file=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090736 4790 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090741 4790 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090747 4790 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090753 4790 flags.go:64] FLAG: --v="2"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090760 4790 flags.go:64] FLAG: --version="false"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090767 4790 flags.go:64] FLAG: --vmodule=""
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090774 4790 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.090779 4790 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090897 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090903 4790 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090910 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090915 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090920 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090924 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090929 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090941 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090945 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090949 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090953 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090957 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090960 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090964 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090970 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090974 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090979 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090983 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090987 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090991 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090995 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.090999 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091003 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091007 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091012 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091017 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091022 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091026 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091030 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091034 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091040 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091045 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091049 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091053 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091066 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091071 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091075 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091079 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091083 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091089 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091092 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091097 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091102 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091108 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091114 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091119 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091125 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091130 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091135 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091140 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091145 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091152 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091157 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091162 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091167 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091171 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091175 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091179 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091183 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091187 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091191 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091194 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091198 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091202 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091206 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091210 4790 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091216 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091220 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091223 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091229 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.091241 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.091258 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.107149 4790 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.107214 4790 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107424 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107456 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107469 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107481 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107492 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107502 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107514 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107525 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107536 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107546 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107558 4790 feature_gate.go:330] unrecognized feature gate: Example
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107568 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107614 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107629 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107644 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107655 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107666 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107676 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107687 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107697 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107707 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107717 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107727 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107739 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107750 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107760 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107770 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107780 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107794 4790 feature_gate.go:353] Setting GA feature 
gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107811 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107824 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107837 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107849 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107861 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107920 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107933 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107946 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107957 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107968 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107979 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.107989 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108004 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108017 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108030 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108041 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108053 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108065 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108076 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108086 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108097 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108107 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108117 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108127 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108137 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108147 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108157 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 08:40:14 crc 
kubenswrapper[4790]: W1003 08:40:14.108171 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108181 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108191 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108201 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108212 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108222 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108232 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108242 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108252 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108262 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108272 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108282 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108293 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108303 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108316 4790 
feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.108335 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108687 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108713 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108728 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108743 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108755 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108766 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108779 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108791 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108802 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108814 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108825 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108835 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108845 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108856 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108865 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108875 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108886 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108897 4790 feature_gate.go:330] 
unrecognized feature gate: BootcNodeManagement Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108913 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108926 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108938 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108952 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108965 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108976 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108986 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.108998 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109008 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109019 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109028 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109039 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109050 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 03 08:40:14 crc 
kubenswrapper[4790]: W1003 08:40:14.109060 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109071 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109080 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109093 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109103 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109114 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109124 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109134 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109144 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109154 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109164 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109174 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109184 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109195 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109206 4790 
feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109216 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109225 4790 feature_gate.go:330] unrecognized feature gate: Example Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109235 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109246 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109256 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109266 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109276 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109286 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109297 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109307 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109317 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109327 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109337 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109350 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109362 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109373 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109384 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109394 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109404 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109416 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109426 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109437 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109447 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109456 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.109473 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.109491 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.109897 4790 server.go:940] "Client rotation is on, will bootstrap in background" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.118213 4790 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.118384 4790 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.120940 4790 server.go:997] "Starting client certificate rotation" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.120980 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.121191 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-22 11:40:48.429401127 +0000 UTC Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.121313 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1203h0m34.308091436s for next certificate rotation Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.144430 4790 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.146435 4790 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.166287 4790 log.go:25] "Validated CRI v1 runtime API" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.198856 4790 log.go:25] "Validated CRI v1 image API" Oct 03 
08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.200881 4790 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.207873 4790 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-03-08-35-14-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.207926 4790 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.227105 4790 manager.go:217] Machine: {Timestamp:2025-10-03 08:40:14.222900549 +0000 UTC m=+0.575834817 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:312be368-bbdc-461d-b74b-f792414190de BootID:e374b313-7030-4739-b7f1-da1c9b39fe8d Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:88:65:62 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:88:65:62 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:3c:58:2e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a3:4a:74 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:df:c1:20 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ca:68:6f Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4a:20:a1:cc:38:3e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5e:81:11:0a:4b:da Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 
Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] 
SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.227369 4790 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.227678 4790 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.230796 4790 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.231395 4790 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.231448 4790 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.231729 4790 topology_manager.go:138] "Creating topology manager with none policy" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.231743 4790 container_manager_linux.go:303] "Creating device plugin manager" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.232555 4790 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.232602 4790 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.232896 4790 state_mem.go:36] "Initialized new in-memory state store" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.233440 4790 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.236638 4790 kubelet.go:418] "Attempting to sync node with API server" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.236662 4790 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.236692 4790 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.236709 4790 kubelet.go:324] "Adding apiserver pod source" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.236723 4790 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.241438 4790 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.242340 4790 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.243151 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.243163 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.243272 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.243285 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.244630 4790 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246593 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246621 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 
08:40:14.246630 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246638 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246650 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246657 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246665 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246677 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246684 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246692 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246704 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.246711 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.250483 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.250953 4790 server.go:1280] "Started kubelet" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.254401 4790 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.254398 4790 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.255719 4790 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.255850 4790 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 03 08:40:14 crc systemd[1]: Started Kubernetes Kubelet. Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.265074 4790 server.go:460] "Adding debug handlers to kubelet server" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.265179 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.265258 4790 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.265285 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 14:37:00.297738121 +0000 UTC Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.265341 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1469h56m46.032399909s for next certificate rotation Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.265654 4790 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.265677 4790 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.265796 4790 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.265778 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.266217 4790 controller.go:145] "Failed 
to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="200ms" Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.266325 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.266410 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.266934 4790 factory.go:55] Registering systemd factory Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.266957 4790 factory.go:221] Registration of the systemd container factory successfully Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.264861 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.113:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aee7a7a8a9a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 08:40:14.250924641 +0000 UTC m=+0.603858879,LastTimestamp:2025-10-03 08:40:14.250924641 +0000 UTC m=+0.603858879,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.269364 4790 factory.go:153] Registering CRI-O factory Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.269397 4790 factory.go:221] Registration of the crio container factory successfully Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.269490 4790 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.269526 4790 factory.go:103] Registering Raw factory Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.269549 4790 manager.go:1196] Started watching for new ooms in manager Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.271889 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272030 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272046 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272065 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272076 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272085 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272094 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272103 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272113 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272122 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272131 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272142 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272153 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272193 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272202 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272214 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272223 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272233 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272242 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272252 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272262 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272274 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272285 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272297 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272308 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272319 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272332 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272346 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272353 4790 manager.go:319] Starting recovery of all containers Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272358 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.272954 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273603 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273635 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273653 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273671 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273689 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273705 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273719 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273736 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273751 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273766 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273781 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273799 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273833 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273857 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273908 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273932 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273966 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.273987 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274010 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274033 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274056 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274076 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274109 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274132 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274153 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274178 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274197 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274211 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" 
seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274226 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274242 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274296 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274312 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274327 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274344 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274360 4790 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274376 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274391 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274405 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274419 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274432 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274474 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274496 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274517 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274533 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274549 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274567 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274627 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.274657 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275301 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275349 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275369 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275387 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275402 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" 
seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275418 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275435 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275450 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275467 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275485 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275502 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 
08:40:14.275521 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275540 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275555 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275573 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275622 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275637 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275651 4790 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275668 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275683 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275698 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275715 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275738 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275753 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275769 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275782 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275809 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275830 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275848 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275865 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275882 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275898 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275916 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275934 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275949 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275963 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275979 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.275991 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276005 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276017 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276031 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276045 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" 
seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276058 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276072 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276085 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276100 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276113 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276150 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276170 4790 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276183 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276197 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276215 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276228 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276241 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276254 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.276270 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278208 4790 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278252 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278269 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278287 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278306 4790 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278328 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278345 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278357 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278372 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278384 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278397 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278409 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278421 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278433 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278447 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278461 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278475 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278487 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278499 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278557 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278573 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278601 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278614 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278626 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278639 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278651 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278676 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278688 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278701 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 03 08:40:14 crc 
kubenswrapper[4790]: I1003 08:40:14.278712 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278724 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278739 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278752 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278765 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278777 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278793 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278803 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278816 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278826 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278836 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278846 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278857 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278867 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278881 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278936 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278948 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278958 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278969 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278983 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.278995 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279007 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279018 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279028 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279042 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279053 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279064 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279075 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279085 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279095 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279108 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279119 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279152 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279163 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279176 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279188 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279228 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279240 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279251 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279262 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279274 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279288 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279301 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" 
seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279312 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279325 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279344 4790 reconstruct.go:97] "Volume reconstruction finished" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.279352 4790 reconciler.go:26] "Reconciler: start to sync state" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.296816 4790 manager.go:324] Recovery completed Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.308543 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.310743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.310809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.310820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.311956 4790 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.311976 4790 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.312010 4790 
state_mem.go:36] "Initialized new in-memory state store" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.325016 4790 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.326565 4790 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.326923 4790 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.326970 4790 kubelet.go:2335] "Starting kubelet main sync loop" Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.327023 4790 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.332480 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.332750 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.338145 4790 policy_none.go:49] "None policy: Start" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.339205 4790 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.339312 4790 state_mem.go:35] "Initializing new in-memory state store" Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.366618 4790 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.408110 4790 manager.go:334] "Starting Device Plugin manager" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.408199 4790 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.408240 4790 server.go:79] "Starting device plugin registration server" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.409043 4790 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.409066 4790 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.409809 4790 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.410002 4790 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.410023 4790 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.416641 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.428390 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.428492 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.429827 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.429886 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.429900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.430095 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.430324 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.430376 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.431225 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.431257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.431269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.431281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.431313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.431327 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.431376 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.431518 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.431592 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.431975 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.432011 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.432024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.432148 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.432373 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.432450 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.432490 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.432396 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.432544 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.433191 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.433245 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.433256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.433492 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.433995 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.434034 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.434077 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.434104 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.434125 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.434926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.434974 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.434993 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.435029 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.435047 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.435056 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.435347 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.435408 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.436826 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.436897 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.436930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.467924 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="400ms" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.480900 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.481171 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.481448 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.481620 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.481697 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.481756 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.481803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.481846 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.481891 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.481995 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.482051 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.482118 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.482163 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.482204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.482314 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.510254 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.512071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.512163 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.512193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.512275 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.513089 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.113:6443: 
connect: connection refused" node="crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.583812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.583877 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.583922 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.583944 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.583970 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.583991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584016 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584017 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584039 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584061 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584094 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584138 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584157 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584160 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584197 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584218 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584279 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584315 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584348 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584381 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584415 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584446 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584115 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584487 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584141 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584519 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584540 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.584589 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.713904 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.715643 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.715700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.715712 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.715746 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.716381 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.113:6443: connect: connection refused" 
node="crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.770435 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.781093 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.805497 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.817481 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: I1003 08:40:14.826462 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.833945 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-378c6038fa93da593eea9707f3c7e68fca9663f418d98abbf33e9e2fce93a28e WatchSource:0}: Error finding container 378c6038fa93da593eea9707f3c7e68fca9663f418d98abbf33e9e2fce93a28e: Status 404 returned error can't find the container with id 378c6038fa93da593eea9707f3c7e68fca9663f418d98abbf33e9e2fce93a28e Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.836091 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-500c1b14f360c375febef7dcfdba662486c03f38aa38e1596aade124cd8db8be WatchSource:0}: Error finding container 500c1b14f360c375febef7dcfdba662486c03f38aa38e1596aade124cd8db8be: Status 404 returned error can't find the container with id 
500c1b14f360c375febef7dcfdba662486c03f38aa38e1596aade124cd8db8be Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.846755 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2686f3395b4e2b4e2b46a04050c7ca064fef64a563a66cb54dee464293829a35 WatchSource:0}: Error finding container 2686f3395b4e2b4e2b46a04050c7ca064fef64a563a66cb54dee464293829a35: Status 404 returned error can't find the container with id 2686f3395b4e2b4e2b46a04050c7ca064fef64a563a66cb54dee464293829a35 Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.850552 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-fbfa7fe8dfbbf297053bdd4f731af24cfbb904936f2f92e22a4f49f5b4706ad2 WatchSource:0}: Error finding container fbfa7fe8dfbbf297053bdd4f731af24cfbb904936f2f92e22a4f49f5b4706ad2: Status 404 returned error can't find the container with id fbfa7fe8dfbbf297053bdd4f731af24cfbb904936f2f92e22a4f49f5b4706ad2 Oct 03 08:40:14 crc kubenswrapper[4790]: W1003 08:40:14.856434 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e4a58a3e333a1033486ad6c949e90a91db1ce6583339723ab1e4cb69275b73ee WatchSource:0}: Error finding container e4a58a3e333a1033486ad6c949e90a91db1ce6583339723ab1e4cb69275b73ee: Status 404 returned error can't find the container with id e4a58a3e333a1033486ad6c949e90a91db1ce6583339723ab1e4cb69275b73ee Oct 03 08:40:14 crc kubenswrapper[4790]: E1003 08:40:14.868712 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" 
interval="800ms" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.116793 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.118595 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.118633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.118649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.118678 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:40:15 crc kubenswrapper[4790]: E1003 08:40:15.119177 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.113:6443: connect: connection refused" node="crc" Oct 03 08:40:15 crc kubenswrapper[4790]: W1003 08:40:15.170545 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:15 crc kubenswrapper[4790]: E1003 08:40:15.170689 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:15 crc kubenswrapper[4790]: W1003 08:40:15.182820 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:15 crc kubenswrapper[4790]: E1003 08:40:15.182896 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:15 crc kubenswrapper[4790]: W1003 08:40:15.191870 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:15 crc kubenswrapper[4790]: E1003 08:40:15.191941 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.256944 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.334939 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"500c1b14f360c375febef7dcfdba662486c03f38aa38e1596aade124cd8db8be"} Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.336564 
4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e4a58a3e333a1033486ad6c949e90a91db1ce6583339723ab1e4cb69275b73ee"} Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.337874 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fbfa7fe8dfbbf297053bdd4f731af24cfbb904936f2f92e22a4f49f5b4706ad2"} Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.339327 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2686f3395b4e2b4e2b46a04050c7ca064fef64a563a66cb54dee464293829a35"} Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.340925 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"378c6038fa93da593eea9707f3c7e68fca9663f418d98abbf33e9e2fce93a28e"} Oct 03 08:40:15 crc kubenswrapper[4790]: W1003 08:40:15.611643 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:15 crc kubenswrapper[4790]: E1003 08:40:15.611798 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:15 crc kubenswrapper[4790]: E1003 08:40:15.669456 
4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="1.6s" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.919533 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.922352 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.922403 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.922416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:15 crc kubenswrapper[4790]: I1003 08:40:15.922449 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:40:15 crc kubenswrapper[4790]: E1003 08:40:15.923126 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.113:6443: connect: connection refused" node="crc" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.256781 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.346334 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9"} Oct 03 
08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.346398 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e"} Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.346432 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff"} Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.346448 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc"} Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.346398 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.347549 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.347616 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.347634 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.348168 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48" exitCode=0 Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.348221 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48"} Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.348234 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.349498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.349600 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.349615 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.351013 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="400a812d6dcd1be3f7dd3023c3e52a4896f677984accef7ecfd27a7770afe056" exitCode=0 Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.351097 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.351080 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"400a812d6dcd1be3f7dd3023c3e52a4896f677984accef7ecfd27a7770afe056"} Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.351827 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.352257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc 
kubenswrapper[4790]: I1003 08:40:16.352309 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.352324 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.352939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.352965 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.352976 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.354026 4790 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="836c0a93c9a8ca16b512613d02dd11de6a852bc04e06dd03ae74c981f093f676" exitCode=0 Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.354068 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"836c0a93c9a8ca16b512613d02dd11de6a852bc04e06dd03ae74c981f093f676"} Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.354127 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.355105 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.355150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.355166 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.357791 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512" exitCode=0 Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.357829 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512"} Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.357929 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.359065 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.359106 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:16 crc kubenswrapper[4790]: I1003 08:40:16.359118 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:16 crc kubenswrapper[4790]: W1003 08:40:16.778282 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:16 crc kubenswrapper[4790]: E1003 08:40:16.779652 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:17 crc kubenswrapper[4790]: W1003 08:40:17.100541 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:17 crc kubenswrapper[4790]: E1003 08:40:17.100703 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.257191 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:17 crc kubenswrapper[4790]: E1003 08:40:17.270339 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.113:6443: connect: connection refused" interval="3.2s" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.364554 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="047ddbddda224cd9235029ce051d9e3594a329bf0afee33fe82f357986193386" exitCode=0 Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.364645 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"047ddbddda224cd9235029ce051d9e3594a329bf0afee33fe82f357986193386"} Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.364714 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.366125 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.366163 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.366173 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.367384 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ac215e99f46978c6b11677b4a1965fb4c1e16820ff221134062941eb162085ff"} Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.367473 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.369045 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.369069 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.369079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.376596 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"55ba2aa3b5d8466c1193dab58f6add86f97ed9065aaa71ba877ad002d7b30ccc"} Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.376636 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d8b7cc917b16307413ed2fb2c91cf91b438cf0499138176e99562d0415bf872c"} Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.376650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ca9eba35e4f0edf15874f10d6b11af0b09a3c3a43eaf8340ec5604122463dc8b"} Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.376788 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.378572 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.378628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.378638 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.380346 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.380346 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee"} Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.380419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9"} Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.380441 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba"} Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.381321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.381352 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.381363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.524272 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.528104 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.529397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.529428 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:17 crc 
kubenswrapper[4790]: I1003 08:40:17.529465 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:40:17 crc kubenswrapper[4790]: E1003 08:40:17.532336 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.113:6443: connect: connection refused" node="crc" Oct 03 08:40:17 crc kubenswrapper[4790]: W1003 08:40:17.549596 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:17 crc kubenswrapper[4790]: E1003 08:40:17.549708 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:17 crc kubenswrapper[4790]: I1003 08:40:17.857796 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:40:17 crc kubenswrapper[4790]: E1003 08:40:17.890632 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.113:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aee7a7a8a9a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-03 08:40:14.250924641 +0000 UTC m=+0.603858879,LastTimestamp:2025-10-03 08:40:14.250924641 +0000 UTC 
m=+0.603858879,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.061001 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.073719 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.257418 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:18 crc kubenswrapper[4790]: W1003 08:40:18.264275 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.113:6443: connect: connection refused Oct 03 08:40:18 crc kubenswrapper[4790]: E1003 08:40:18.264395 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.113:6443: connect: connection refused" logger="UnhandledError" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.385301 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9386a8f03b1066fc35cd6a10d0eb073b525b6c8802d3211ac60488f6a231d837" exitCode=0 Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.385408 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.385428 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9386a8f03b1066fc35cd6a10d0eb073b525b6c8802d3211ac60488f6a231d837"} Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.386154 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.386178 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.386187 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.388967 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.389053 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"34084b4e3a826fa6d90d797f57f66a30e3601840e4bd58d42cc81c2ad5517104"} Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.389082 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba"} Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.389183 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.389352 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:18 crc 
kubenswrapper[4790]: I1003 08:40:18.389841 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.390040 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.390064 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.390075 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.390296 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.390324 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.390336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.390504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.390526 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.390537 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.391144 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.391171 4790 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:18 crc kubenswrapper[4790]: I1003 08:40:18.391184 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.019068 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.398388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cab8364048d83836f12d33731b82fcadd0527f407da4b5343733f5e245d29a64"} Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.398455 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"637948fd9e320da9f1d3edbb91399d9bec76dd14c1e12f23975484c96f258a4f"} Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.398461 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.398469 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.398521 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.398466 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd9a12a9a1b0afd247d92ff2723994cdeba69d3cbcfe00eb589ffb1be4b7b374"} Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.398707 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f031864d14c6032eab35a3563f92c99253446c550dd27903656bde7e5892fc8d"} Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.398725 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.398808 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.399621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.399649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.399658 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.400026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.400091 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.400112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.400273 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.400302 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:19 crc kubenswrapper[4790]: I1003 08:40:19.400337 4790 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.406248 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.406204 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"70b83da4dd337f5d3e58363e69374a5e11b89f25c0b50578a753542a14b6b0d8"} Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.406497 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.407524 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.407569 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.407601 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.408046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.408095 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.408117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.455335 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.455660 4790 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.455743 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.457433 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.457503 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.457525 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.733042 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.734703 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.734785 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.734809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:20 crc kubenswrapper[4790]: I1003 08:40:20.734858 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.266701 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.409463 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.411179 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.411224 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.411241 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.540505 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.540799 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.542279 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.542358 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.542387 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.982643 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.982875 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.984238 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.984279 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 03 08:40:21 crc kubenswrapper[4790]: I1003 08:40:21.984294 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4790]: I1003 08:40:22.411889 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:22 crc kubenswrapper[4790]: I1003 08:40:22.412893 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:22 crc kubenswrapper[4790]: I1003 08:40:22.412926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:22 crc kubenswrapper[4790]: I1003 08:40:22.412937 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:22 crc kubenswrapper[4790]: I1003 08:40:22.825662 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.414866 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.416256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.416343 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.416371 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.455454 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.455709 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.537989 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.538228 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.540002 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.540056 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:23 crc kubenswrapper[4790]: I1003 08:40:23.540070 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:24 crc kubenswrapper[4790]: E1003 08:40:24.416993 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 03 08:40:28 crc kubenswrapper[4790]: I1003 08:40:28.666395 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42300->192.168.126.11:17697: read: connection reset by peer" 
start-of-body= Oct 03 08:40:28 crc kubenswrapper[4790]: I1003 08:40:28.666707 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:42300->192.168.126.11:17697: read: connection reset by peer" Oct 03 08:40:28 crc kubenswrapper[4790]: I1003 08:40:28.893156 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 08:40:28 crc kubenswrapper[4790]: I1003 08:40:28.893218 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 08:40:28 crc kubenswrapper[4790]: I1003 08:40:28.906696 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 03 08:40:28 crc kubenswrapper[4790]: I1003 08:40:28.906764 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 03 
08:40:29 crc kubenswrapper[4790]: I1003 08:40:29.029239 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]log ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]etcd ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-filter ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-informers ok Oct 03 08:40:29 crc kubenswrapper[4790]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Oct 03 08:40:29 crc kubenswrapper[4790]: [-]poststarthook/crd-informer-synced failed: reason withheld Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/start-system-namespaces-controller ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 03 08:40:29 crc 
kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 03 08:40:29 crc kubenswrapper[4790]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 03 08:40:29 crc kubenswrapper[4790]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/bootstrap-controller ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/start-kube-aggregator-informers ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/apiservice-registration-controller ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/apiservice-discovery-controller ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]autoregister-completion ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapi-controller ok Oct 03 08:40:29 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 03 08:40:29 crc kubenswrapper[4790]: livez check failed Oct 03 08:40:29 crc kubenswrapper[4790]: I1003 08:40:29.029346 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:40:29 crc kubenswrapper[4790]: I1003 08:40:29.431564 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 08:40:29 crc kubenswrapper[4790]: I1003 08:40:29.434523 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="34084b4e3a826fa6d90d797f57f66a30e3601840e4bd58d42cc81c2ad5517104" exitCode=255 Oct 03 08:40:29 crc kubenswrapper[4790]: I1003 08:40:29.434591 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"34084b4e3a826fa6d90d797f57f66a30e3601840e4bd58d42cc81c2ad5517104"} Oct 03 08:40:29 crc kubenswrapper[4790]: I1003 08:40:29.434843 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:29 crc kubenswrapper[4790]: I1003 08:40:29.435825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:29 crc kubenswrapper[4790]: I1003 08:40:29.435860 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:29 crc kubenswrapper[4790]: I1003 08:40:29.435878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:29 crc kubenswrapper[4790]: I1003 08:40:29.436538 4790 scope.go:117] "RemoveContainer" containerID="34084b4e3a826fa6d90d797f57f66a30e3601840e4bd58d42cc81c2ad5517104" Oct 03 08:40:30 crc kubenswrapper[4790]: I1003 08:40:30.440270 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 08:40:30 crc kubenswrapper[4790]: I1003 08:40:30.442957 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2"} Oct 03 08:40:30 crc kubenswrapper[4790]: I1003 08:40:30.443288 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:30 crc kubenswrapper[4790]: I1003 08:40:30.445040 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:30 crc kubenswrapper[4790]: I1003 08:40:30.445103 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:30 crc kubenswrapper[4790]: I1003 08:40:30.445123 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.304941 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.305227 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.307850 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.307909 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.307928 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc 
kubenswrapper[4790]: I1003 08:40:31.325105 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.448480 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.449627 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.452830 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2" exitCode=255 Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.452932 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2"} Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.453129 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.453236 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.453716 4790 scope.go:117] "RemoveContainer" containerID="34084b4e3a826fa6d90d797f57f66a30e3601840e4bd58d42cc81c2ad5517104" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.454893 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.454962 4790 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.454987 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.454893 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.455060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.455081 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.457753 4790 scope.go:117] "RemoveContainer" containerID="905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2" Oct 03 08:40:31 crc kubenswrapper[4790]: E1003 08:40:31.458133 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.993458 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.993825 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.995889 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.995931 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:31 crc kubenswrapper[4790]: I1003 08:40:31.995940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:32 crc kubenswrapper[4790]: I1003 08:40:32.457186 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.456618 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.456690 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 08:40:33 crc kubenswrapper[4790]: E1003 08:40:33.895224 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.899300 4790 trace.go:236] Trace[1938206358]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 08:40:22.782) (total time: 11116ms): Oct 03 08:40:33 crc kubenswrapper[4790]: Trace[1938206358]: ---"Objects listed" error: 
11116ms (08:40:33.899) Oct 03 08:40:33 crc kubenswrapper[4790]: Trace[1938206358]: [11.116743759s] [11.116743759s] END Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.899359 4790 trace.go:236] Trace[1772293212]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 08:40:21.775) (total time: 12124ms): Oct 03 08:40:33 crc kubenswrapper[4790]: Trace[1772293212]: ---"Objects listed" error: 12124ms (08:40:33.899) Oct 03 08:40:33 crc kubenswrapper[4790]: Trace[1772293212]: [12.124211925s] [12.124211925s] END Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.899377 4790 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.899376 4790 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.900810 4790 trace.go:236] Trace[1632790073]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 08:40:22.000) (total time: 11900ms): Oct 03 08:40:33 crc kubenswrapper[4790]: Trace[1632790073]: ---"Objects listed" error: 11899ms (08:40:33.900) Oct 03 08:40:33 crc kubenswrapper[4790]: Trace[1632790073]: [11.900001909s] [11.900001909s] END Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.900840 4790 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.900869 4790 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.901698 4790 trace.go:236] Trace[2140337383]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Oct-2025 08:40:21.058) (total time: 12843ms): Oct 03 08:40:33 crc kubenswrapper[4790]: Trace[2140337383]: ---"Objects listed" error: 12843ms (08:40:33.901) Oct 03 08:40:33 crc kubenswrapper[4790]: 
Trace[2140337383]: [12.843511715s] [12.843511715s] END Oct 03 08:40:33 crc kubenswrapper[4790]: I1003 08:40:33.901745 4790 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 03 08:40:33 crc kubenswrapper[4790]: E1003 08:40:33.903161 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.027133 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.032685 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.154647 4790 scope.go:117] "RemoveContainer" containerID="905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.154871 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.249808 4790 apiserver.go:52] "Watching apiserver" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.252269 4790 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.252709 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-gprtl","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.253419 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gprtl" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.253846 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.254189 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.254244 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.254322 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.254362 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.254419 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.254843 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.255229 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.255242 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.255344 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.257086 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.257631 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.257969 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.258102 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.258131 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.258265 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.258383 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.258387 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.258494 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.258537 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 03 08:40:34 crc 
kubenswrapper[4790]: I1003 08:40:34.260905 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.267389 4790 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.284449 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303102 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303162 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303193 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303214 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303236 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303520 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303546 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303598 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303631 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303655 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303681 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303710 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303741 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303768 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303790 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303815 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303839 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303862 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303890 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303916 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303938 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303965 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303949 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.303989 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304011 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304025 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304032 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304135 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304166 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304186 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304213 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304257 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304273 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304279 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304309 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304338 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304365 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: 
I1003 08:40:34.304365 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304389 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304421 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304444 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304455 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304472 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304505 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304535 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304563 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304612 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304665 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304694 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304722 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304751 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304782 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305441 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304501 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304688 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.304725 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.304821 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:40:34.804798899 +0000 UTC m=+21.157733137 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306452 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306482 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306504 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306525 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306543 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306561 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306592 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306608 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306625 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306640 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306656 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306673 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306690 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " 
Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306705 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306721 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306736 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306751 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306767 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306782 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306799 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306816 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306832 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306851 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306867 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 
08:40:34.306885 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306902 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306919 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306953 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306970 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306985 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307000 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307018 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307039 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307057 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307076 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307097 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307115 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307132 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307149 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307166 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" 
(UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307183 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307200 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307218 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307236 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307271 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307288 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307304 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307322 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307355 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307374 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307393 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307412 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307431 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307455 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307472 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307489 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307509 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307528 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307563 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307564 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307596 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307619 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307638 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307654 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307672 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307689 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307709 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307725 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307929 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307946 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307962 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: 
\"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.307594 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305047 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305184 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305331 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305335 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305449 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305617 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305592 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305690 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305924 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.305940 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306015 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306011 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306301 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306316 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306321 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.306380 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.308167 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.308315 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.308405 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.308460 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.308665 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.308714 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.308842 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.309018 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.309038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.309289 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.309327 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.309344 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.310163 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.310166 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.310444 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.310450 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.310645 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.310965 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.311148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.311502 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.311597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.311783 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.311909 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.311990 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.312199 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.312222 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.312615 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.312941 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.313208 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.314344 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.314436 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.314122 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.314732 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.314903 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.315304 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.315450 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.315684 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.315841 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.315881 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.315887 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.315936 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.315969 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.315996 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316023 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316065 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316092 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316119 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316146 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" 
(UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316172 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316198 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316227 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316252 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316280 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316308 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316338 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316363 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316392 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316419 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316444 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316471 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316512 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316540 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.316569 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317086 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317169 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317254 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317293 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317324 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317326 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317352 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317383 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317424 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317452 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317480 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317508 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317544 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317568 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317612 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317614 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317636 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317671 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317698 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317722 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317747 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318007 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318050 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318140 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318320 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318347 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.317776 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318521 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318546 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc 
kubenswrapper[4790]: I1003 08:40:34.318547 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318602 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318626 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318651 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318687 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318713 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318740 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318778 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318779 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318806 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318835 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318840 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.318969 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319008 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319034 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319067 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319092 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319113 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319138 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319161 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319186 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319208 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319229 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319250 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319272 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319308 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319328 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319353 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319376 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319400 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319425 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319454 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319479 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319503 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " 
Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319528 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319560 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.319612 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.320497 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.320678 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.320724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.320772 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.320824 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.320858 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66xc5\" (UniqueName: 
\"kubernetes.io/projected/0c5bfb61-e69a-4576-b200-18dd7f38c7f1-kube-api-access-66xc5\") pod \"node-resolver-gprtl\" (UID: \"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\") " pod="openshift-dns/node-resolver-gprtl" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.320888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c5bfb61-e69a-4576-b200-18dd7f38c7f1-hosts-file\") pod \"node-resolver-gprtl\" (UID: \"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\") " pod="openshift-dns/node-resolver-gprtl" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.320925 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.320961 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.320995 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321027 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321064 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321096 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321130 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321158 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321184 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321362 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321384 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321398 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321413 4790 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321428 4790 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321447 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321460 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321474 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321487 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321507 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321522 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321536 4790 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321554 4790 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.321566 4790 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322522 4790 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322544 4790 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322563 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322613 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322630 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322647 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322668 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322681 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322695 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322709 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322739 4790 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322753 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" 
DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322767 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322784 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322796 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322809 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322824 4790 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322841 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322859 4790 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc 
kubenswrapper[4790]: I1003 08:40:34.322872 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322885 4790 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322902 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322916 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322929 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322943 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322960 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322972 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.322986 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323002 4790 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323016 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323029 4790 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323041 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323057 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323070 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 
crc kubenswrapper[4790]: I1003 08:40:34.323084 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323098 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323114 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323128 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323141 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323157 4790 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323170 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: 
I1003 08:40:34.323183 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323195 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323214 4790 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323227 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323240 4790 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323253 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323269 4790 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323282 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323295 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323310 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323327 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323340 4790 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323353 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323368 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323380 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc 
kubenswrapper[4790]: I1003 08:40:34.323392 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323408 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323426 4790 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323440 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323454 4790 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323468 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323485 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323498 4790 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323513 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323527 4790 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.323543 4790 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.324865 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.325245 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.325240 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.325466 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.325688 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.335525 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.336416 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.336863 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.336860 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.336909 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.337305 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.337441 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.337732 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.337748 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.338818 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.341656 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.342124 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.342607 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.342718 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.342965 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.343135 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.343199 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.343312 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.343434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.343630 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.343809 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.344057 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.344193 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.344280 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.344295 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.344469 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.344954 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.344963 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.345094 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.345256 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.345743 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.346135 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.346664 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.347001 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.347630 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.347682 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.350731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.348099 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.347728 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.347795 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.347794 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.347898 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.348173 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.348235 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.348752 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.348565 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.349104 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.349204 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.349238 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.349453 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.349593 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.348782 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.349927 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.349925 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.349984 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.350016 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.350192 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.350489 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.350505 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.350811 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.350947 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.351329 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.351446 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.351640 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.352051 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.352244 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.352273 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.352316 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.352910 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.352761 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.352890 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.352938 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.353562 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.353721 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.353809 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:34.85378634 +0000 UTC m=+21.206720578 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.353906 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.353723 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.354080 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.354204 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:34.85412829 +0000 UTC m=+21.207062528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.354313 4790 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.355130 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.355139 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.355457 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.355529 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.355743 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.358865 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: 
"a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.359309 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.359664 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.359671 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.359779 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.360332 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.360366 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.360996 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.361722 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.362064 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.362112 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.362314 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.362598 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.362718 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.363882 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.365113 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.366674 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.366869 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.366897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.367493 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.367928 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.368523 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.370354 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.370384 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.370403 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.370476 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:34.870451394 +0000 UTC m=+21.223385832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.371117 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.372912 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.377408 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" 
(OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.377623 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.377875 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.378696 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.378735 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.379002 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.379493 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.379813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.379853 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.380353 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.380438 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.380546 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.380781 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.381088 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.382455 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.382748 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.382834 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.384122 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.384596 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.385650 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.385677 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.385689 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.385743 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:34.885725949 +0000 UTC m=+21.238660187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.387492 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.387933 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.389856 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.392041 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.393485 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.394033 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.398524 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.402998 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.403197 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.405032 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.405489 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.406797 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 03 08:40:34 crc 
kubenswrapper[4790]: I1003 08:40:34.409644 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.411984 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.413841 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.418729 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.420500 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.422034 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.422031 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.422604 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.424018 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.424673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.424792 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.425045 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.425159 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.425306 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66xc5\" (UniqueName: \"kubernetes.io/projected/0c5bfb61-e69a-4576-b200-18dd7f38c7f1-kube-api-access-66xc5\") pod \"node-resolver-gprtl\" (UID: \"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\") " pod="openshift-dns/node-resolver-gprtl" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.425430 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.425621 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c5bfb61-e69a-4576-b200-18dd7f38c7f1-hosts-file\") pod \"node-resolver-gprtl\" (UID: \"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\") " pod="openshift-dns/node-resolver-gprtl" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.425882 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.425979 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426069 4790 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426220 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426334 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426446 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426133 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.425494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426185 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c5bfb61-e69a-4576-b200-18dd7f38c7f1-hosts-file\") pod \"node-resolver-gprtl\" (UID: \"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\") " pod="openshift-dns/node-resolver-gprtl" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426557 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426754 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426768 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426779 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426789 4790 
reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426800 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426810 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426820 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426830 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426841 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426851 4790 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426863 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426874 4790 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426884 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426894 4790 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426903 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426913 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426936 4790 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426958 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on 
node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426975 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.426989 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427003 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427018 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427031 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427043 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427056 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427070 4790 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427083 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427098 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427113 4790 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427271 4790 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427283 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427293 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427303 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427313 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427323 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427332 4790 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427342 4790 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427352 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427363 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427375 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427386 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427397 4790 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427407 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427417 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427427 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427436 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427445 4790 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427457 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427451 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427469 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427638 4790 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427657 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427672 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427686 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node 
\"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427700 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427713 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427725 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427736 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427781 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427795 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427806 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427817 4790 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427836 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427850 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427863 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427874 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427887 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427899 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427911 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427923 4790 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427934 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427963 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427975 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.427988 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428003 4790 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428014 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428026 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428030 4790 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428040 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428052 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428063 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428076 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428088 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on 
node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428099 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428124 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428139 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428151 4790 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428174 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428186 4790 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428149 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 
08:40:34.428194 4790 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428208 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428221 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428234 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428246 4790 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428255 4790 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428265 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428277 4790 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428289 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428303 4790 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428317 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428328 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428340 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428352 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428365 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node 
\"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428378 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428393 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428404 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.428413 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.430532 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.431048 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.431936 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.433908 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.434775 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.435809 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.436541 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.438140 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.438399 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.438674 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 
08:40:34.439336 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.440560 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.441985 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.442513 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.443556 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.444144 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.445500 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.446009 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 
08:40:34.446588 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.447509 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.448273 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.449336 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.449846 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.451140 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.452261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66xc5\" (UniqueName: \"kubernetes.io/projected/0c5bfb61-e69a-4576-b200-18dd7f38c7f1-kube-api-access-66xc5\") pod \"node-resolver-gprtl\" (UID: 
\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\") " pod="openshift-dns/node-resolver-gprtl" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.462227 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.474693 4790 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.474752 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.474923 4790 scope.go:117] "RemoveContainer" containerID="905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.475101 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 08:40:34 crc kubenswrapper[4790]: 
I1003 08:40:34.493546 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.506167 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.525982 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.539592 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.548534 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.559737 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.569978 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.574358 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gprtl" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.582283 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.590177 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.590169 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: W1003 08:40:34.591659 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c5bfb61_e69a_4576_b200_18dd7f38c7f1.slice/crio-016066a1fc5d31723d5a97167f15ec255a30364c6fb687b86978aca736a61c08 WatchSource:0}: Error finding container 016066a1fc5d31723d5a97167f15ec255a30364c6fb687b86978aca736a61c08: Status 404 returned error can't find the container with id 016066a1fc5d31723d5a97167f15ec255a30364c6fb687b86978aca736a61c08 Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.603897 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.610682 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.653842 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 03 08:40:34 crc kubenswrapper[4790]: W1003 08:40:34.673723 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-dc17bdbea184ec8828fd53b38f2eb28ea05d14587c226b9c2304c85cebe1c508 WatchSource:0}: Error finding container dc17bdbea184ec8828fd53b38f2eb28ea05d14587c226b9c2304c85cebe1c508: Status 404 returned error can't find the container with id dc17bdbea184ec8828fd53b38f2eb28ea05d14587c226b9c2304c85cebe1c508 Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.831867 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:34 crc 
kubenswrapper[4790]: E1003 08:40:34.832120 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:40:35.83208407 +0000 UTC m=+22.185018488 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.932976 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.933019 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.933045 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:34 crc kubenswrapper[4790]: I1003 08:40:34.933064 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933171 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933223 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:35.933206672 +0000 UTC m=+22.286140910 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933544 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933558 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933589 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933615 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:35.933607083 +0000 UTC m=+22.286541321 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933643 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933661 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:35.933655654 +0000 UTC m=+22.286589892 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933703 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933714 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933722 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:34 crc kubenswrapper[4790]: E1003 08:40:34.933742 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:35.933736537 +0000 UTC m=+22.286670775 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.306853 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-hncl2"] Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.307666 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: W1003 08:40:35.309560 4790 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.309622 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 08:40:35 crc kubenswrapper[4790]: W1003 08:40:35.310043 4790 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list *v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 03 08:40:35 
crc kubenswrapper[4790]: W1003 08:40:35.310073 4790 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.310096 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.310109 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 08:40:35 crc kubenswrapper[4790]: W1003 08:40:35.310129 4790 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.310196 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource 
\"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 08:40:35 crc kubenswrapper[4790]: W1003 08:40:35.310383 4790 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.310406 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.319595 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.333180 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.346003 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.356685 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.377062 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.393382 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.406546 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.418110 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.434898 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437287 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-daemon-config\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437339 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-cnibin\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437362 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-socket-dir-parent\") pod 
\"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437384 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-run-netns\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-conf-dir\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-system-cni-dir\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437449 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-run-k8s-cni-cncf-io\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-var-lib-cni-bin\") pod \"multus-hncl2\" (UID: 
\"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437639 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-etc-kubernetes\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437734 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f2f5631d-8486-44f9-81b0-ee78515e7866-cni-binary-copy\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437776 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-var-lib-cni-multus\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437812 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-var-lib-kubelet\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437870 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-cni-dir\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " 
pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437895 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-os-release\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437953 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-run-multus-certs\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.437982 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-hostroot\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.438026 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-strp2\" (UniqueName: \"kubernetes.io/projected/f2f5631d-8486-44f9-81b0-ee78515e7866-kube-api-access-strp2\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.474770 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0"} Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.474855 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8e2c1727664a06a8b23d42a2119f8c1eebc7474bb3e3f70b0b197aa764fed2f8"} Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.480602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"dc17bdbea184ec8828fd53b38f2eb28ea05d14587c226b9c2304c85cebe1c508"} Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.484088 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818"} Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.484175 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e"} Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.484198 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"36df0e7549e05d597ff390b1744725491244410d6379daaf295cd4429eab531e"} Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.487681 4790 scope.go:117] "RemoveContainer" containerID="905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2" Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.487857 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.488071 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gprtl" event={"ID":"0c5bfb61-e69a-4576-b200-18dd7f38c7f1","Type":"ContainerStarted","Data":"dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4"} Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.488105 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gprtl" event={"ID":"0c5bfb61-e69a-4576-b200-18dd7f38c7f1","Type":"ContainerStarted","Data":"016066a1fc5d31723d5a97167f15ec255a30364c6fb687b86978aca736a61c08"} Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.495059 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.510781 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.528733 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.538966 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-strp2\" (UniqueName: \"kubernetes.io/projected/f2f5631d-8486-44f9-81b0-ee78515e7866-kube-api-access-strp2\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539038 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-daemon-config\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539076 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-cnibin\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539094 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-socket-dir-parent\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539113 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-run-netns\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539129 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-conf-dir\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539148 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-system-cni-dir\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539167 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-run-k8s-cni-cncf-io\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539184 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-var-lib-cni-bin\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539216 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-etc-kubernetes\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539236 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f2f5631d-8486-44f9-81b0-ee78515e7866-cni-binary-copy\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539279 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-var-lib-cni-multus\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539295 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-var-lib-kubelet\") pod \"multus-hncl2\" (UID: 
\"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539315 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-cni-dir\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539330 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-os-release\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539346 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-hostroot\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539363 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-run-multus-certs\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-run-multus-certs\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539838 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-cnibin\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539861 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-etc-kubernetes\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539915 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-var-lib-cni-multus\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539935 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-socket-dir-parent\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539953 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-var-lib-kubelet\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.539968 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-run-netns\") pod \"multus-hncl2\" (UID: 
\"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.540000 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-conf-dir\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.540058 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-system-cni-dir\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.540089 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-run-k8s-cni-cncf-io\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.540119 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-host-var-lib-cni-bin\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.540185 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-cni-dir\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.540220 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-hostroot\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.540609 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f2f5631d-8486-44f9-81b0-ee78515e7866-os-release\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.542877 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.555717 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.569913 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.583062 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.599014 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.615098 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.627961 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.645694 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.673719 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444
e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.694021 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-zqj4j"] Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.694599 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9fnsr"] Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.694782 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.695492 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.697697 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.699487 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.706123 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.706274 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.706123 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.706297 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.706617 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.707316 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.750009 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.794231 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.826266 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.842646 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.842799 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:40:37.842777285 +0000 UTC m=+24.195711523 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.842830 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36eae06a-d8be-4128-8d68-acb32e1f011b-proxy-tls\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.842862 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e01f9373-74e4-439b-bd7f-ad00e60a3284-cni-binary-copy\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.842883 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krfjv\" (UniqueName: \"kubernetes.io/projected/e01f9373-74e4-439b-bd7f-ad00e60a3284-kube-api-access-krfjv\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.842897 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/36eae06a-d8be-4128-8d68-acb32e1f011b-mcd-auth-proxy-config\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.842913 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww2bb\" (UniqueName: \"kubernetes.io/projected/36eae06a-d8be-4128-8d68-acb32e1f011b-kube-api-access-ww2bb\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.843061 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-system-cni-dir\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.843193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.843223 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e01f9373-74e4-439b-bd7f-ad00e60a3284-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 
08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.843250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-os-release\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.843313 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/36eae06a-d8be-4128-8d68-acb32e1f011b-rootfs\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.843346 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-cnibin\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.843865 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.857758 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.873989 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.883290 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.899599 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.910855 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.924378 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.937654 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.944778 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.944864 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36eae06a-d8be-4128-8d68-acb32e1f011b-proxy-tls\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.944904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e01f9373-74e4-439b-bd7f-ad00e60a3284-cni-binary-copy\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.944928 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krfjv\" (UniqueName: \"kubernetes.io/projected/e01f9373-74e4-439b-bd7f-ad00e60a3284-kube-api-access-krfjv\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.944955 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36eae06a-d8be-4128-8d68-acb32e1f011b-mcd-auth-proxy-config\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.944973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww2bb\" (UniqueName: \"kubernetes.io/projected/36eae06a-d8be-4128-8d68-acb32e1f011b-kube-api-access-ww2bb\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: 
I1003 08:40:35.944998 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945018 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-system-cni-dir\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945041 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945057 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e01f9373-74e4-439b-bd7f-ad00e60a3284-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-os-release\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " 
pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945111 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945134 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/36eae06a-d8be-4128-8d68-acb32e1f011b-rootfs\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945152 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-cnibin\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.945156 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:35 crc 
kubenswrapper[4790]: I1003 08:40:35.945201 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-cnibin\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.945240 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:37.945217093 +0000 UTC m=+24.298151331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.945312 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.945325 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.945338 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945346 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-os-release\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/36eae06a-d8be-4128-8d68-acb32e1f011b-rootfs\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.945645 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-system-cni-dir\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.945365 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:37.945356456 +0000 UTC m=+24.298290694 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.945414 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.945860 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:37.945851891 +0000 UTC m=+24.298786129 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.946035 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/36eae06a-d8be-4128-8d68-acb32e1f011b-mcd-auth-proxy-config\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.946191 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.946223 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.946240 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.946257 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e01f9373-74e4-439b-bd7f-ad00e60a3284-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " 
pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.946238 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e01f9373-74e4-439b-bd7f-ad00e60a3284-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:35 crc kubenswrapper[4790]: E1003 08:40:35.946303 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:37.946282343 +0000 UTC m=+24.299216811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.955331 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.971102 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.977076 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:35 crc kubenswrapper[4790]: I1003 08:40:35.986916 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.000623 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.015301 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.045563 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/36eae06a-d8be-4128-8d68-acb32e1f011b-proxy-tls\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.045657 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww2bb\" (UniqueName: \"kubernetes.io/projected/36eae06a-d8be-4128-8d68-acb32e1f011b-kube-api-access-ww2bb\") pod \"machine-config-daemon-zqj4j\" (UID: \"36eae06a-d8be-4128-8d68-acb32e1f011b\") " pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.103381 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsjvh"] Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.105103 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.107475 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.108119 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.108404 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.108464 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.108534 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.108707 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.108824 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.124838 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.136464 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.151430 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.156084 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.160504 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f2f5631d-8486-44f9-81b0-ee78515e7866-cni-binary-copy\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:36 crc kubenswrapper[4790]: 
I1003 08:40:36.166052 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e01f9373-74e4-439b-bd7f-ad00e60a3284-cni-binary-copy\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.170173 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.189544 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.207295 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.225400 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.240786 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.247823 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-log-socket\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.247925 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52pgl\" (UniqueName: \"kubernetes.io/projected/fa372dcc-63aa-48e5-b153-7739078026ef-kube-api-access-52pgl\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.247962 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-netns\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248023 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-etc-openvswitch\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248077 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248156 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-ovn\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248217 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-kubelet\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-openvswitch\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248381 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-bin\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248432 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-systemd\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-systemd-units\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248522 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-node-log\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248599 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-env-overrides\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248627 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-script-lib\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248711 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-netd\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248741 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa372dcc-63aa-48e5-b153-7739078026ef-ovn-node-metrics-cert\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248845 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-config\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248878 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.248911 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-slash\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc 
kubenswrapper[4790]: I1003 08:40:36.248966 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-var-lib-openvswitch\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.269365 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.286955 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.299689 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.309560 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.316393 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a
4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.327207 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:36 crc kubenswrapper[4790]: E1003 08:40:36.327368 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.328771 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:36 crc kubenswrapper[4790]: E1003 08:40:36.328892 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.328960 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:36 crc kubenswrapper[4790]: E1003 08:40:36.329045 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.335451 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.336393 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.339282 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.349987 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-env-overrides\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.350184 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-script-lib\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.351050 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-env-overrides\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.351235 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-script-lib\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.351394 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-netd\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.351531 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-netd\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.351719 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa372dcc-63aa-48e5-b153-7739078026ef-ovn-node-metrics-cert\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-config\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc 
kubenswrapper[4790]: I1003 08:40:36.352480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-slash\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352513 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-var-lib-openvswitch\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352649 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-log-socket\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352671 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52pgl\" (UniqueName: \"kubernetes.io/projected/fa372dcc-63aa-48e5-b153-7739078026ef-kube-api-access-52pgl\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352695 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-netns\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352723 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-etc-openvswitch\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352762 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-ovn\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352785 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352840 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-kubelet\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352862 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-openvswitch\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352880 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-bin\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352922 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-systemd\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352952 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-systemd-units\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.352972 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-node-log\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353090 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-node-log\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353134 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-slash\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353159 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-var-lib-openvswitch\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353153 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-config\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353212 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-log-socket\") pod \"ovnkube-node-vsjvh\" (UID: 
\"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353239 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-kubelet\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-openvswitch\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353254 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353311 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-netns\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-etc-openvswitch\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353379 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-systemd\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353408 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-ovn\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353418 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-bin\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.353468 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-systemd-units\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.358196 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa372dcc-63aa-48e5-b153-7739078026ef-ovn-node-metrics-cert\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.373492 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52pgl\" (UniqueName: \"kubernetes.io/projected/fa372dcc-63aa-48e5-b153-7739078026ef-kube-api-access-52pgl\") pod \"ovnkube-node-vsjvh\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.435838 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.465642 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.488241 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.490465 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f2f5631d-8486-44f9-81b0-ee78515e7866-multus-daemon-config\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.491758 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"fb04c9b53e5662b1b1ce16f20e3a3786d7c654771784f8c7c13bc2b299305b69"} Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.493075 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"179fd7ef9236cbccc231d9c8ae77862c35390fb123bb38b62441df8c69e2b09e"} Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.493755 4790 scope.go:117] "RemoveContainer" 
containerID="905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2" Oct 03 08:40:36 crc kubenswrapper[4790]: E1003 08:40:36.493938 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 08:40:36 crc kubenswrapper[4790]: E1003 08:40:36.554162 4790 projected.go:288] Couldn't get configMap openshift-multus/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Oct 03 08:40:36 crc kubenswrapper[4790]: E1003 08:40:36.554236 4790 projected.go:194] Error preparing data for projected volume kube-api-access-strp2 for pod openshift-multus/multus-hncl2: failed to sync configmap cache: timed out waiting for the condition Oct 03 08:40:36 crc kubenswrapper[4790]: E1003 08:40:36.554323 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2f5631d-8486-44f9-81b0-ee78515e7866-kube-api-access-strp2 podName:f2f5631d-8486-44f9-81b0-ee78515e7866 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:37.05429833 +0000 UTC m=+23.407232568 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-strp2" (UniqueName: "kubernetes.io/projected/f2f5631d-8486-44f9-81b0-ee78515e7866-kube-api-access-strp2") pod "multus-hncl2" (UID: "f2f5631d-8486-44f9-81b0-ee78515e7866") : failed to sync configmap cache: timed out waiting for the condition Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.765393 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.801738 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.812802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krfjv\" (UniqueName: \"kubernetes.io/projected/e01f9373-74e4-439b-bd7f-ad00e60a3284-kube-api-access-krfjv\") pod \"multus-additional-cni-plugins-9fnsr\" (UID: \"e01f9373-74e4-439b-bd7f-ad00e60a3284\") " pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:36 crc kubenswrapper[4790]: I1003 08:40:36.916782 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" Oct 03 08:40:36 crc kubenswrapper[4790]: W1003 08:40:36.929800 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode01f9373_74e4_439b_bd7f_ad00e60a3284.slice/crio-432f4a1a13d315af0c9d7efebf7ca333936b9279d9f4a61d399b5245918f0d34 WatchSource:0}: Error finding container 432f4a1a13d315af0c9d7efebf7ca333936b9279d9f4a61d399b5245918f0d34: Status 404 returned error can't find the container with id 432f4a1a13d315af0c9d7efebf7ca333936b9279d9f4a61d399b5245918f0d34 Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.061629 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-strp2\" (UniqueName: \"kubernetes.io/projected/f2f5631d-8486-44f9-81b0-ee78515e7866-kube-api-access-strp2\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.066547 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-strp2\" (UniqueName: \"kubernetes.io/projected/f2f5631d-8486-44f9-81b0-ee78515e7866-kube-api-access-strp2\") pod \"multus-hncl2\" (UID: \"f2f5631d-8486-44f9-81b0-ee78515e7866\") " pod="openshift-multus/multus-hncl2" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.113594 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.120851 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hncl2" Oct 03 08:40:37 crc kubenswrapper[4790]: W1003 08:40:37.133911 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2f5631d_8486_44f9_81b0_ee78515e7866.slice/crio-60114a09b1a6b13c80ab16a34f9a4bbb7c804e5c7dea40cd7f91e14c360d53f2 WatchSource:0}: Error finding container 60114a09b1a6b13c80ab16a34f9a4bbb7c804e5c7dea40cd7f91e14c360d53f2: Status 404 returned error can't find the container with id 60114a09b1a6b13c80ab16a34f9a4bbb7c804e5c7dea40cd7f91e14c360d53f2 Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.234795 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-7bpwn"] Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.235321 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.237561 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.237879 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.237879 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.239413 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.253601 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.267234 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.282104 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.293989 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.314042 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.329097 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.339629 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.353157 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.364063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b-serviceca\") pod \"node-ca-7bpwn\" (UID: \"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\") " pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.364122 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b-host\") pod \"node-ca-7bpwn\" (UID: \"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\") " pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.364181 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gt4gf\" (UniqueName: \"kubernetes.io/projected/a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b-kube-api-access-gt4gf\") pod \"node-ca-7bpwn\" (UID: \"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\") " pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.365902 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.387914 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.403162 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.418744 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.430263 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.464997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b-host\") pod \"node-ca-7bpwn\" (UID: \"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\") " pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.465070 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt4gf\" 
(UniqueName: \"kubernetes.io/projected/a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b-kube-api-access-gt4gf\") pod \"node-ca-7bpwn\" (UID: \"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\") " pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.465102 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b-serviceca\") pod \"node-ca-7bpwn\" (UID: \"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\") " pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.465183 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b-host\") pod \"node-ca-7bpwn\" (UID: \"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\") " pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.465980 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b-serviceca\") pod \"node-ca-7bpwn\" (UID: \"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\") " pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.482010 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt4gf\" (UniqueName: \"kubernetes.io/projected/a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b-kube-api-access-gt4gf\") pod \"node-ca-7bpwn\" (UID: \"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\") " pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.497708 4790 generic.go:334] "Generic (PLEG): container finished" podID="e01f9373-74e4-439b-bd7f-ad00e60a3284" containerID="876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057" exitCode=0 Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.497793 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" event={"ID":"e01f9373-74e4-439b-bd7f-ad00e60a3284","Type":"ContainerDied","Data":"876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057"} Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.497839 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" event={"ID":"e01f9373-74e4-439b-bd7f-ad00e60a3284","Type":"ContainerStarted","Data":"432f4a1a13d315af0c9d7efebf7ca333936b9279d9f4a61d399b5245918f0d34"} Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.500402 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302"} Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.500438 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74"} Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.502048 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8"} Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.504622 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4" exitCode=0 Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.504660 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4"} Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.507118 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hncl2" event={"ID":"f2f5631d-8486-44f9-81b0-ee78515e7866","Type":"ContainerStarted","Data":"3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd"} Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.507194 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hncl2" event={"ID":"f2f5631d-8486-44f9-81b0-ee78515e7866","Type":"ContainerStarted","Data":"60114a09b1a6b13c80ab16a34f9a4bbb7c804e5c7dea40cd7f91e14c360d53f2"} Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.507760 4790 scope.go:117] "RemoveContainer" containerID="905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2" Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.508124 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.531219 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.544126 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.551168 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7bpwn" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.558513 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.574965 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.589335 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.611436 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.629886 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.648453 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.664738 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.677691 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.695265 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 
08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.709980 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.724955 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.737941 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.752665 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.766055 4790 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.779439 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.790384 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.804284 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.818905 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.833496 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.849308 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.869063 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.870117 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.870368 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:40:41.870326034 +0000 UTC m=+28.223260272 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.886177 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.903963 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.918318 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:37Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.971790 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.971873 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.971915 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:37 crc kubenswrapper[4790]: I1003 08:40:37.971980 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.972184 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.972213 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.972232 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.972311 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-03 08:40:41.972285609 +0000 UTC m=+28.325219877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.972968 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.973034 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:41.973017689 +0000 UTC m=+28.325951967 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.973127 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.973154 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.973169 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.973208 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:41.973195864 +0000 UTC m=+28.326130132 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.973286 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:37 crc kubenswrapper[4790]: E1003 08:40:37.973342 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:41.973312977 +0000 UTC m=+28.326247255 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.327473 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.327515 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:38 crc kubenswrapper[4790]: E1003 08:40:38.327925 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.327515 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:38 crc kubenswrapper[4790]: E1003 08:40:38.328076 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:38 crc kubenswrapper[4790]: E1003 08:40:38.328154 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.514077 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05"} Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.514134 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b"} Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.514147 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c"} Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.514157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef"} Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.516170 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" event={"ID":"e01f9373-74e4-439b-bd7f-ad00e60a3284","Type":"ContainerStarted","Data":"938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77"} Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.519401 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7bpwn" 
event={"ID":"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b","Type":"ContainerStarted","Data":"305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73"} Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.519440 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7bpwn" event={"ID":"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b","Type":"ContainerStarted","Data":"d421109c6fb9fba54d1002b1fc703d41bf7701a75d62b2b4ad777f0426a6140e"} Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.534226 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptable
s-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.549702 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3
c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.565959 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.582759 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.596242 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.623017 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.653308 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.675357 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.703999 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.727888 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.745972 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.758301 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.772879 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.788139 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.798948 4790 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.812835 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.825099 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.837895 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.860179 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.873645 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.887093 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.897674 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.910447 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.925529 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 
08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.936906 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:38 crc kubenswrapper[4790]: I1003 08:40:38.949051 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:38Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.530404 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297"} Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.530505 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7"} Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.534957 4790 generic.go:334] "Generic (PLEG): container finished" podID="e01f9373-74e4-439b-bd7f-ad00e60a3284" containerID="938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77" exitCode=0 Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.535014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" event={"ID":"e01f9373-74e4-439b-bd7f-ad00e60a3284","Type":"ContainerDied","Data":"938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77"} Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.555796 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.569122 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.584779 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.605841 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.622955 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.639263 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 
08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.652047 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.666081 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.679939 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.696913 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.711640 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.728873 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:39 crc kubenswrapper[4790]: I1003 08:40:39.743032 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:39Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.303850 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.306837 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.306909 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.306934 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.307090 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.316160 4790 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.316520 4790 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.318296 4790 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.318341 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.318351 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.318370 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.318382 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.327452 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.327588 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:40 crc kubenswrapper[4790]: E1003 08:40:40.327756 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.327819 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:40 crc kubenswrapper[4790]: E1003 08:40:40.327943 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:40 crc kubenswrapper[4790]: E1003 08:40:40.328195 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:40 crc kubenswrapper[4790]: E1003 08:40:40.338633 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.343895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.343946 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.343957 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.343978 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.343990 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4790]: E1003 08:40:40.358012 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.362321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.362369 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.362384 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.362404 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.362416 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4790]: E1003 08:40:40.378594 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.382716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.382743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.382752 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.382768 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.382778 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4790]: E1003 08:40:40.401539 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.406513 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.406644 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.406676 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.406713 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.406743 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4790]: E1003 08:40:40.427687 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: E1003 08:40:40.427854 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.429851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.429895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.429921 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.429952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.429969 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.461100 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.464552 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.470266 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.474074 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.485799 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.495953 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.512497 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.526991 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.532660 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.532709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.532721 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.532745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.532758 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.541477 4790 generic.go:334] "Generic (PLEG): container finished" podID="e01f9373-74e4-439b-bd7f-ad00e60a3284" containerID="ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671" exitCode=0 Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.541529 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" event={"ID":"e01f9373-74e4-439b-bd7f-ad00e60a3284","Type":"ContainerDied","Data":"ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671"} Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.551825 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.568895 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.583086 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.592880 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.604970 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.616788 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.635425 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.635672 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.635735 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.635758 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.635797 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.637635 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.651843 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.666140 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.681219 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.696039 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.711918 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.737794 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.738956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.739037 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.739054 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.739079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.739134 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.755928 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.772845 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.791384 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.815315 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.832763 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.842931 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.843006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.843025 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.843053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.843071 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.855009 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z 
is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.875105 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.901030 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.921943 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e9
6f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:40Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.945809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.945863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.945878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.945900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:40 crc kubenswrapper[4790]: I1003 08:40:40.945915 4790 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:40Z","lastTransitionTime":"2025-10-03T08:40:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.049053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.049109 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.049121 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.049145 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.049158 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.152687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.152743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.152757 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.152790 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.152809 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.255092 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.255153 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.255168 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.255191 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.255205 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.358112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.358192 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.358217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.358247 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.358271 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.461677 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.461764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.461793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.461830 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.461856 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.549670 4790 generic.go:334] "Generic (PLEG): container finished" podID="e01f9373-74e4-439b-bd7f-ad00e60a3284" containerID="9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee" exitCode=0 Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.549784 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" event={"ID":"e01f9373-74e4-439b-bd7f-ad00e60a3284","Type":"ContainerDied","Data":"9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.565729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.566200 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.566232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.566244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.566263 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.566277 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.568656 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.596857 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.621163 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.640788 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.668188 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.682082 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.682200 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.682217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.682248 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.682264 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.686933 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.699416 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.712937 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.728738 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.743081 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.756125 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.769132 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.784558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.784653 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.784668 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.784689 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.784702 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.787997 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.802173 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:41Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.887485 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.887547 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.887563 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.887635 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.887671 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.920107 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:41 crc kubenswrapper[4790]: E1003 08:40:41.920394 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:40:49.920368601 +0000 UTC m=+36.273302859 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.991000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.991075 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.991089 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.991117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:41 crc kubenswrapper[4790]: I1003 08:40:41.991132 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:41Z","lastTransitionTime":"2025-10-03T08:40:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.021378 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.021466 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.021503 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.021548 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.021705 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.021803 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.021834 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.021855 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.021868 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:50.021834312 +0000 UTC m=+36.374768580 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.021870 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.021924 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.021973 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.021936 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:50.021910685 +0000 UTC m=+36.374844963 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.022006 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.022150 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:50.022054239 +0000 UTC m=+36.374988477 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.022192 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:50.022181683 +0000 UTC m=+36.375115911 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.095265 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.095340 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.095352 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.095375 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.095389 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.198766 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.198833 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.198845 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.198870 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.198883 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.302630 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.303164 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.303176 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.303199 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.303212 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.329782 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.329830 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.329983 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.330081 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.330147 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:42 crc kubenswrapper[4790]: E1003 08:40:42.330338 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.407125 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.407666 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.407928 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.408133 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.408355 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.511982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.512065 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.512083 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.512111 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.512129 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.638792 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.638872 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.638890 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.638916 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.638938 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.645896 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" event={"ID":"e01f9373-74e4-439b-bd7f-ad00e60a3284","Type":"ContainerStarted","Data":"a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb"} Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.666007 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837
e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.682456 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.696896 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.711471 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.729565 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.742411 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.742466 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.742477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.742498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.742515 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.746166 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.765301 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.781990 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.802824 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.819716 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.845673 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.845730 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.845742 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc 
kubenswrapper[4790]: I1003 08:40:42.845764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.845777 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.881291 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.906688 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.920470 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.931972 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.948399 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.948468 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.948484 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.948512 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:42 crc kubenswrapper[4790]: I1003 08:40:42.948532 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:42Z","lastTransitionTime":"2025-10-03T08:40:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.051841 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.051934 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.051955 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.051985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.052201 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.158918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.158975 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.158991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.159013 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.159030 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.261427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.261467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.261475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.261491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.261501 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.364233 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.364301 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.364319 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.364337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.364347 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.467421 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.467483 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.467500 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.468097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.468142 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.571654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.571709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.571726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.571750 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.571768 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.655478 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.655943 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.655997 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.661570 4790 generic.go:334] "Generic (PLEG): container finished" podID="e01f9373-74e4-439b-bd7f-ad00e60a3284" containerID="a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb" exitCode=0 Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.661668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" event={"ID":"e01f9373-74e4-439b-bd7f-ad00e60a3284","Type":"ContainerDied","Data":"a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.674045 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.675209 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.675263 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.675280 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.675304 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.675320 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.689004 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z 
is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.705004 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.707979 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.719826 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.752187 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.773339 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.777965 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.777995 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.778005 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.778024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.778035 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.786533 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.804158 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.820541 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.840695 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.870396 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.887582 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-
03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.895555 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.895623 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.895638 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.895660 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.895677 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.904677 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.920191 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.941660 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.961653 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.983265 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.999168 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.999249 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.999275 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.999364 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:43 crc kubenswrapper[4790]: I1003 08:40:43.999436 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:43Z","lastTransitionTime":"2025-10-03T08:40:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.001624 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9b
c994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:43Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.020321 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers 
with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"
name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.032622 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.048058 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.064277 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.079497 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.104217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.104715 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.104729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc 
kubenswrapper[4790]: I1003 08:40:44.104753 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.104768 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.153272 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.169676 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
3T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.184473 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.195957 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.208455 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.208576 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.208613 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.208647 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.208666 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.209982 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.223336 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 
08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.311893 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.311950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.311966 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.311987 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc 
kubenswrapper[4790]: I1003 08:40:44.312003 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.327634 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:44 crc kubenswrapper[4790]: E1003 08:40:44.327783 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.327838 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.327927 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:44 crc kubenswrapper[4790]: E1003 08:40:44.328121 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:44 crc kubenswrapper[4790]: E1003 08:40:44.328245 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.356827 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.376143 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.400442 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.414157 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.429248 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.439737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.439801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.439822 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.439850 4790 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.439871 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.443378 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.468492 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.482909 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.499042 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.511699 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.532878 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.542408 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.542456 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.542469 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.542488 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.542501 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.547553 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.562705 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.578873 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.645533 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.645638 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.645656 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.645685 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.645702 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.671049 4790 generic.go:334] "Generic (PLEG): container finished" podID="e01f9373-74e4-439b-bd7f-ad00e60a3284" containerID="76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656" exitCode=0 Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.671147 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" event={"ID":"e01f9373-74e4-439b-bd7f-ad00e60a3284","Type":"ContainerDied","Data":"76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656"} Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.671359 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.690061 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444
e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.705179 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.722052 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.738553 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.748982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.749029 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.749040 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.749064 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.749081 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.755773 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.775801 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"20
25-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.795262 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.813176 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4
111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.825107 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.840073 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.853851 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.857220 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.857303 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.857317 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc 
kubenswrapper[4790]: I1003 08:40:44.857338 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.857351 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.869263 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.883924 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.910473 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.960847 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.960884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.960894 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.960911 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:44 crc kubenswrapper[4790]: I1003 08:40:44.960922 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:44Z","lastTransitionTime":"2025-10-03T08:40:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.063706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.063763 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.063778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.063801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.063815 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.166767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.166809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.166820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.166840 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.166852 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.269618 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.269679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.269696 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.269724 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.269741 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.373208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.373263 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.373282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.373308 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.373325 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.476963 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.477039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.477062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.477096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.477343 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.580910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.580989 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.581004 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.581032 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.581050 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.678970 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" event={"ID":"e01f9373-74e4-439b-bd7f-ad00e60a3284","Type":"ContainerStarted","Data":"45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.679109 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.683654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.683732 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.683801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.683841 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.683868 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.698842 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9b
c994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.717477 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.738365 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.756709 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.783803 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.786658 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.786720 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.786734 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.786758 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.786771 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.801212 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.823664 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.842693 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.866076 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.887500 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.889825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.889888 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.889902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc 
kubenswrapper[4790]: I1003 08:40:45.889925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.889942 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.920441 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.940892 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.955607 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.970947 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:45Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.993962 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.994016 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.994030 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.994054 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:45 crc kubenswrapper[4790]: I1003 08:40:45.994074 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:45Z","lastTransitionTime":"2025-10-03T08:40:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.097898 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.097982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.098007 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.098042 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.098067 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.200866 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.200927 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.200938 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.200960 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.200974 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.304354 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.304419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.304430 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.304454 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.304468 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.327991 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.328051 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.328051 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:46 crc kubenswrapper[4790]: E1003 08:40:46.328250 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:46 crc kubenswrapper[4790]: E1003 08:40:46.328443 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:46 crc kubenswrapper[4790]: E1003 08:40:46.328609 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.407643 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.407679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.407689 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.407707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.407717 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.511390 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.511433 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.511443 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.511460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.511470 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.614869 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.614927 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.614940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.614961 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.614975 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.718061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.718100 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.718110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.718128 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.718143 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.820915 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.821372 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.821391 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.821418 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.821436 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.924855 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.924910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.924924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.924948 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:46 crc kubenswrapper[4790]: I1003 08:40:46.924964 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:46Z","lastTransitionTime":"2025-10-03T08:40:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.028006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.028087 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.028111 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.028149 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.028175 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.131076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.131119 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.131129 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.131148 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.131162 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.233359 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.233429 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.233446 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.233478 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.233497 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.337978 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.338082 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.338112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.338161 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.338205 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.443124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.443168 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.443182 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.443202 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.443213 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.547627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.548188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.548395 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.548639 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.548875 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.652857 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.652930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.652948 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.652978 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.652997 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.688764 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/0.log" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.694164 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b" exitCode=1 Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.694234 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.695426 4790 scope.go:117] "RemoveContainer" containerID="f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.722505 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.743994 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.757191 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.757281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.757294 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.757320 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.757335 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.757877 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9b
c994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.773691 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.784906 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.797080 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.812500 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.825744 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.851436 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:46Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 08:40:46.779070 6066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:40:46.779119 6066 handler.go:190] Sending *v1.EgressIP 
event handler 8 for removal\\\\nI1003 08:40:46.779154 6066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:40:46.779161 6066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:40:46.779205 6066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:40:46.779253 6066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:46.779260 6066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:46.779333 6066 factory.go:656] Stopping watch factory\\\\nI1003 08:40:46.779335 6066 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:46.779354 6066 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:46.779365 6066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:40:46.779375 6066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:40:46.779383 6066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:46.779397 6066 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:40:46.779363 6066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.861063 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.861138 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.861162 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.861193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.861214 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.869486 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.891545 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.906836 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.932918 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.943194 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764"] Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 
08:40:47.943865 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:47 crc kubenswrapper[4790]: W1003 08:40:47.946406 4790 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd": failed to list *v1.Secret: secrets "ovn-kubernetes-control-plane-dockercfg-gs7dd" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 03 08:40:47 crc kubenswrapper[4790]: W1003 08:40:47.946451 4790 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert": failed to list *v1.Secret: secrets "ovn-control-plane-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Oct 03 08:40:47 crc kubenswrapper[4790]: E1003 08:40:47.946461 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-gs7dd\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-control-plane-dockercfg-gs7dd\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 08:40:47 crc kubenswrapper[4790]: E1003 08:40:47.946500 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-control-plane-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this 
object" logger="UnhandledError" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.958957 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.963900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.963961 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.963978 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.964008 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.964027 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:47Z","lastTransitionTime":"2025-10-03T08:40:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.979305 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.994725 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:47Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.998755 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12525ca4-5242-41bc-b7ba-2c1ab5a45892-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.998807 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12525ca4-5242-41bc-b7ba-2c1ab5a45892-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.998833 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2t6qw\" (UniqueName: \"kubernetes.io/projected/12525ca4-5242-41bc-b7ba-2c1ab5a45892-kube-api-access-2t6qw\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:47 crc kubenswrapper[4790]: I1003 08:40:47.998865 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12525ca4-5242-41bc-b7ba-2c1ab5a45892-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.008815 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.021407 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.044427 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:46Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 08:40:46.779070 6066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:40:46.779119 6066 handler.go:190] Sending *v1.EgressIP 
event handler 8 for removal\\\\nI1003 08:40:46.779154 6066 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1003 08:40:46.779161 6066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:40:46.779205 6066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:40:46.779253 6066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:46.779260 6066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:46.779333 6066 factory.go:656] Stopping watch factory\\\\nI1003 08:40:46.779335 6066 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:46.779354 6066 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:46.779365 6066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:40:46.779375 6066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:40:46.779383 6066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:46.779397 6066 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:40:46.779363 6066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.061831 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.067382 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.067433 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.067446 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.067469 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.067483 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.077117 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.093717 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 
08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.099858 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12525ca4-5242-41bc-b7ba-2c1ab5a45892-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.099929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12525ca4-5242-41bc-b7ba-2c1ab5a45892-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.099959 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t6qw\" (UniqueName: \"kubernetes.io/projected/12525ca4-5242-41bc-b7ba-2c1ab5a45892-kube-api-access-2t6qw\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.099995 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12525ca4-5242-41bc-b7ba-2c1ab5a45892-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.101031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/12525ca4-5242-41bc-b7ba-2c1ab5a45892-env-overrides\") pod 
\"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.101148 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/12525ca4-5242-41bc-b7ba-2c1ab5a45892-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.131367 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t6qw\" (UniqueName: \"kubernetes.io/projected/12525ca4-5242-41bc-b7ba-2c1ab5a45892-kube-api-access-2t6qw\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.133722 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.156991 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.171495 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.171607 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.171635 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.171672 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.171698 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.175476 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z 
is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.195321 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.210740 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.221538 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.233667 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.274212 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.274277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.274286 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.274313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.274334 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.327678 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.327710 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:48 crc kubenswrapper[4790]: E1003 08:40:48.327892 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.327912 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:48 crc kubenswrapper[4790]: E1003 08:40:48.328042 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:48 crc kubenswrapper[4790]: E1003 08:40:48.328164 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.377105 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.377167 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.377186 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.377212 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.377231 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.480692 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.480758 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.480771 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.480796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.480811 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.583416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.583482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.583504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.583531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.583548 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.686860 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.686909 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.686919 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.686942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.686953 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.699728 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/0.log" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.702987 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09"} Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.703147 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.721920 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.735041 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.751218 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.767364 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.782880 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: 
I1003 08:40:48.789618 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.789707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.789733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.789773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.789799 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.817990 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:46Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 08:40:46.779070 6066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:40:46.779119 6066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:40:46.779154 6066 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1003 08:40:46.779161 6066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:40:46.779205 6066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:40:46.779253 6066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:46.779260 6066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:46.779333 6066 factory.go:656] Stopping watch factory\\\\nI1003 08:40:46.779335 6066 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:46.779354 6066 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:46.779365 6066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:40:46.779375 6066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:40:46.779383 6066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:46.779397 6066 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:40:46.779363 6066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.831505 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.833827 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.847089 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.860501 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.876630 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.892365 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444
e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.893495 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.893555 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.893601 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.893626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.893643 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.904801 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.917306 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.926974 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9d
a410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.966017 4790 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:48Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.996805 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.996858 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.996872 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.996893 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:48 crc kubenswrapper[4790]: I1003 08:40:48.996916 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:48Z","lastTransitionTime":"2025-10-03T08:40:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.044307 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4vl95"] Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.044935 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:49 crc kubenswrapper[4790]: E1003 08:40:49.045029 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.060623 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffa
d5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4
bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2
025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.072073 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.088608 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: E1003 08:40:49.102235 4790 secret.go:188] Couldn't get secret openshift-ovn-kubernetes/ovn-control-plane-metrics-cert: failed to sync secret cache: timed out waiting for the condition Oct 03 08:40:49 crc kubenswrapper[4790]: E1003 08:40:49.102371 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12525ca4-5242-41bc-b7ba-2c1ab5a45892-ovn-control-plane-metrics-cert podName:12525ca4-5242-41bc-b7ba-2c1ab5a45892 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:49.60233279 +0000 UTC m=+35.955267028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovn-control-plane-metrics-cert" (UniqueName: "kubernetes.io/secret/12525ca4-5242-41bc-b7ba-2c1ab5a45892-ovn-control-plane-metrics-cert") pod "ovnkube-control-plane-749d76644c-fn764" (UID: "12525ca4-5242-41bc-b7ba-2c1ab5a45892") : failed to sync secret cache: timed out waiting for the condition Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.104410 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.104458 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.104536 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.104592 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.104602 4790 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.104613 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.111250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j2gd\" (UniqueName: \"kubernetes.io/projected/9236b957-81cb-40bc-969e-a2057884ba50-kube-api-access-6j2gd\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.111290 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.116614 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.129280 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.150756 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:46Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 08:40:46.779070 6066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:40:46.779119 6066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:40:46.779154 6066 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1003 08:40:46.779161 6066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:40:46.779205 6066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:40:46.779253 6066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:46.779260 6066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:46.779333 6066 factory.go:656] Stopping watch factory\\\\nI1003 08:40:46.779335 6066 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:46.779354 6066 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:46.779365 6066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:40:46.779375 6066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:40:46.779383 6066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:46.779397 6066 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:40:46.779363 6066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.165998 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc 
kubenswrapper[4790]: I1003 08:40:49.185356 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.190930 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.206865 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.208906 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.209041 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.209133 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.209209 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.209275 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.212132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j2gd\" (UniqueName: \"kubernetes.io/projected/9236b957-81cb-40bc-969e-a2057884ba50-kube-api-access-6j2gd\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.212207 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:49 crc kubenswrapper[4790]: E1003 08:40:49.212360 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:49 crc kubenswrapper[4790]: E1003 08:40:49.212433 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs podName:9236b957-81cb-40bc-969e-a2057884ba50 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:49.712412621 +0000 UTC m=+36.065346869 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs") pod "network-metrics-daemon-4vl95" (UID: "9236b957-81cb-40bc-969e-a2057884ba50") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.224046 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.234548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j2gd\" (UniqueName: \"kubernetes.io/projected/9236b957-81cb-40bc-969e-a2057884ba50-kube-api-access-6j2gd\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.242569 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.258208 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444
e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.278955 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.297123 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.312085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.312494 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.312667 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.312750 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.312811 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.317721 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9b
c994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.417125 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.417187 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.417205 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.417234 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.417341 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.520714 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.520783 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.520797 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.520831 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.520860 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.617263 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12525ca4-5242-41bc-b7ba-2c1ab5a45892-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.623029 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/12525ca4-5242-41bc-b7ba-2c1ab5a45892-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fn764\" (UID: \"12525ca4-5242-41bc-b7ba-2c1ab5a45892\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.624536 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.624710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.624836 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.624996 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.625113 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.709710 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/1.log" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.710556 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/0.log" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.714475 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09" exitCode=1 Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.714532 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.714611 4790 scope.go:117] "RemoveContainer" containerID="f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.715626 4790 scope.go:117] "RemoveContainer" containerID="6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09" Oct 03 08:40:49 crc kubenswrapper[4790]: E1003 08:40:49.716071 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" Oct 03 08:40:49 
crc kubenswrapper[4790]: I1003 08:40:49.718815 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:49 crc kubenswrapper[4790]: E1003 08:40:49.719003 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:49 crc kubenswrapper[4790]: E1003 08:40:49.719064 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs podName:9236b957-81cb-40bc-969e-a2057884ba50 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:50.719041479 +0000 UTC m=+37.071975727 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs") pod "network-metrics-daemon-4vl95" (UID: "9236b957-81cb-40bc-969e-a2057884ba50") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.730110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.730331 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.730397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.730500 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.730609 4790 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.741542 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.756014 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.762108 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.777011 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e5778
8a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.794868 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.810251 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.829899 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.834953 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.834992 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.835005 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.835026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.835038 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.842636 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.870145 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.887949 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.901937 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.920525 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:40:49 crc kubenswrapper[4790]: E1003 08:40:49.920821 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:41:05.920777718 +0000 UTC m=+52.273711956 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.927734 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:46Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 08:40:46.779070 6066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:40:46.779119 6066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:40:46.779154 6066 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1003 08:40:46.779161 6066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:40:46.779205 6066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:40:46.779253 6066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:46.779260 6066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:46.779333 6066 factory.go:656] Stopping watch factory\\\\nI1003 08:40:46.779335 6066 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:46.779354 6066 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:46.779365 6066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:40:46.779375 6066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:40:46.779383 6066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:46.779397 6066 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:40:46.779363 6066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"6 for removal\\\\nI1003 08:40:48.888763 6236 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:48.888772 6236 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:48.888784 6236 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:40:48.888812 6236 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 08:40:48.888830 6236 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 08:40:48.888831 6236 handler.go:208] Removed *v1.Namespace event handler 
1\\\\nI1003 08:40:48.888804 6236 factory.go:656] Stopping watch factory\\\\nI1003 08:40:48.888954 6236 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 08:40:48.889061 6236 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1003 08:40:48.889112 6236 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 2.188891ms\\\\nI1003 08:40:48.889115 6236 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 08:40:48.889164 6236 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:48.889201 6236 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:48.889310 6236 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.938913 4790 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.938987 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.939015 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.939043 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.939057 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:49Z","lastTransitionTime":"2025-10-03T08:40:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.944082 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc 
kubenswrapper[4790]: I1003 08:40:49.963055 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.979685 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:49 crc kubenswrapper[4790]: I1003 08:40:49.995697 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:49Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.012020 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.043721 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.044457 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.044473 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc 
kubenswrapper[4790]: I1003 08:40:50.044497 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.044509 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.122376 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.122450 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.122482 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 
08:40:50.122516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122727 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122762 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122777 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122809 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122844 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:06.122822436 +0000 UTC m=+52.475756674 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122809 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122925 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122939 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122892 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122926 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:06.122897518 +0000 UTC m=+52.475831786 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.122986 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:06.12297467 +0000 UTC m=+52.475908908 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.123001 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:06.122993531 +0000 UTC m=+52.475927759 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.147494 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.147544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.147555 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.147595 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.147608 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.251084 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.251148 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.251162 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.251182 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.251194 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.327867 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.328053 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.328531 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.328628 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.328697 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.328756 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.328912 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.328995 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.353750 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.353811 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.353825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.353849 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.353865 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.457005 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.457048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.457062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.457080 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.457093 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.560561 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.560654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.560673 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.560700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.560721 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.663728 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.663796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.663812 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.663838 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.663856 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.690691 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.690751 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.690768 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.690790 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.690806 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.707680 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.712737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.712803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.712821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.712841 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.712854 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.720312 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" event={"ID":"12525ca4-5242-41bc-b7ba-2c1ab5a45892","Type":"ContainerStarted","Data":"d79cdfa897fe5ca615930826e3a153a184ab7977626a9837dbcdbddbe4d27af6"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.720398 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" event={"ID":"12525ca4-5242-41bc-b7ba-2c1ab5a45892","Type":"ContainerStarted","Data":"1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.720413 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" event={"ID":"12525ca4-5242-41bc-b7ba-2c1ab5a45892","Type":"ContainerStarted","Data":"c8c0bdaee250023dfe4fb4db4f0b722cb6fb0b0bb906cfb128483b8c71515859"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.722223 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/1.log" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.728267 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.728616 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.728730 4790 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs podName:9236b957-81cb-40bc-969e-a2057884ba50 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:52.728691335 +0000 UTC m=+39.081625623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs") pod "network-metrics-daemon-4vl95" (UID: "9236b957-81cb-40bc-969e-a2057884ba50") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.729102 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.734122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.734163 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.734175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.734198 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.734214 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.741094 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9b
c994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.748517 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.754327 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.754384 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.754393 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.754413 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.754426 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.755054 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.769521 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.773667 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redha
t-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc
4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\
"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":4488870
27}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.778039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.778100 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.778113 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.778132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.778144 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.786159 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.800553 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: E1003 08:40:50.800693 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.806674 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.806706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.806717 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.806770 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.806787 4790 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.811409 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\
\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.824329 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.838330 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.852369 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.866825 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.882973 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.899881 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.909329 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.909389 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.909405 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:50 crc 
kubenswrapper[4790]: I1003 08:40:50.909432 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.909450 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:50Z","lastTransitionTime":"2025-10-03T08:40:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.923075 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:46Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 08:40:46.779070 6066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:40:46.779119 6066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:40:46.779154 6066 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1003 08:40:46.779161 6066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:40:46.779205 6066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:40:46.779253 6066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:46.779260 6066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:46.779333 6066 factory.go:656] Stopping watch factory\\\\nI1003 08:40:46.779335 6066 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:46.779354 6066 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:46.779365 6066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:40:46.779375 6066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:40:46.779383 6066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:46.779397 6066 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:40:46.779363 6066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"6 for removal\\\\nI1003 08:40:48.888763 6236 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:48.888772 6236 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:48.888784 6236 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:40:48.888812 6236 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 08:40:48.888830 6236 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 08:40:48.888831 6236 handler.go:208] Removed *v1.Namespace event handler 
1\\\\nI1003 08:40:48.888804 6236 factory.go:656] Stopping watch factory\\\\nI1003 08:40:48.888954 6236 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 08:40:48.889061 6236 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1003 08:40:48.889112 6236 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 2.188891ms\\\\nI1003 08:40:48.889115 6236 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 08:40:48.889164 6236 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:48.889201 6236 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:48.889310 6236 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.935472 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc 
kubenswrapper[4790]: I1003 08:40:50.951522 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.962370 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:50 crc kubenswrapper[4790]: I1003 08:40:50.975366 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:50Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.012344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.012391 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.012402 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.012418 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.012429 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.115596 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.115654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.115665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.115686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.115697 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.218846 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.219028 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.219053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.219082 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.219102 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.322978 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.323031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.323048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.323076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.323093 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.328461 4790 scope.go:117] "RemoveContainer" containerID="905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.426995 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.427079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.427099 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.427130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.427153 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.530296 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.530346 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.530364 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.530385 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.530397 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.635198 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.635447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.635467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.635495 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.635516 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.733311 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.737863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.737934 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.737959 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.737987 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.738008 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.737863 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.738889 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.759255 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.776915 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.791835 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.809390 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7
977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.829348 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.841678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.841717 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.841729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.841753 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.841769 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.849808 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.865715 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.881338 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.910119 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:46Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 08:40:46.779070 6066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:40:46.779119 6066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:40:46.779154 6066 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1003 08:40:46.779161 6066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:40:46.779205 6066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:40:46.779253 6066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:46.779260 6066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:46.779333 6066 factory.go:656] Stopping watch factory\\\\nI1003 08:40:46.779335 6066 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:46.779354 6066 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:46.779365 6066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:40:46.779375 6066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:40:46.779383 6066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:46.779397 6066 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:40:46.779363 6066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"6 for removal\\\\nI1003 08:40:48.888763 6236 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:48.888772 6236 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:48.888784 6236 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:40:48.888812 6236 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 08:40:48.888830 6236 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 08:40:48.888831 6236 handler.go:208] Removed *v1.Namespace event handler 
1\\\\nI1003 08:40:48.888804 6236 factory.go:656] Stopping watch factory\\\\nI1003 08:40:48.888954 6236 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 08:40:48.889061 6236 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1003 08:40:48.889112 6236 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 2.188891ms\\\\nI1003 08:40:48.889115 6236 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 08:40:48.889164 6236 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:48.889201 6236 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:48.889310 6236 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.926444 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc 
kubenswrapper[4790]: I1003 08:40:51.945095 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.945146 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.945171 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.945195 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.945206 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:51Z","lastTransitionTime":"2025-10-03T08:40:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.954430 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.966273 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.985099 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:51 crc kubenswrapper[4790]: I1003 08:40:51.999120 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:51Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.016677 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:52Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.032732 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:52Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.047929 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.047962 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.047973 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.047992 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.048008 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.151695 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.151741 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.151749 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.151767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.151777 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.260994 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.261043 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.261056 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.261078 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.261092 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.328109 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.328144 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:52 crc kubenswrapper[4790]: E1003 08:40:52.328302 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.328388 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.328428 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:52 crc kubenswrapper[4790]: E1003 08:40:52.328560 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:52 crc kubenswrapper[4790]: E1003 08:40:52.328687 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:52 crc kubenswrapper[4790]: E1003 08:40:52.328747 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.364385 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.364428 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.364438 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.364454 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.364463 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.467291 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.467325 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.467336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.467352 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.467361 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.571224 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.571298 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.571322 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.571358 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.571380 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.674291 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.674374 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.674393 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.674421 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.674441 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.750093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:52 crc kubenswrapper[4790]: E1003 08:40:52.750354 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:52 crc kubenswrapper[4790]: E1003 08:40:52.750517 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs podName:9236b957-81cb-40bc-969e-a2057884ba50 nodeName:}" failed. No retries permitted until 2025-10-03 08:40:56.750470263 +0000 UTC m=+43.103404661 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs") pod "network-metrics-daemon-4vl95" (UID: "9236b957-81cb-40bc-969e-a2057884ba50") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.779231 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.779307 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.779323 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.779350 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.779567 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.882731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.882778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.882790 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.882811 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.882825 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.986116 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.986175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.986187 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.986214 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:52 crc kubenswrapper[4790]: I1003 08:40:52.986227 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:52Z","lastTransitionTime":"2025-10-03T08:40:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.088821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.088871 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.088886 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.088907 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.088917 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.191631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.191708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.191725 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.191750 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.191769 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.294208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.294270 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.294282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.294307 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.294320 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.397848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.397907 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.397919 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.397941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.397958 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.501481 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.501545 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.501554 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.501593 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.501608 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.604839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.604917 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.604928 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.604950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.604962 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.707702 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.707749 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.707761 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.707780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.707802 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.811327 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.811379 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.811388 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.811405 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.811426 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.913729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.913787 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.913802 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.913824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:53 crc kubenswrapper[4790]: I1003 08:40:53.913838 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:53Z","lastTransitionTime":"2025-10-03T08:40:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.016936 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.016995 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.017007 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.017027 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.017038 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.119900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.119943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.119953 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.119969 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.119980 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.222973 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.223021 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.223035 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.223051 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.223062 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.325665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.325718 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.325733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.325753 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.325771 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.328080 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.328137 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:54 crc kubenswrapper[4790]: E1003 08:40:54.328232 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.328166 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.328362 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:54 crc kubenswrapper[4790]: E1003 08:40:54.328553 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:54 crc kubenswrapper[4790]: E1003 08:40:54.328709 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:40:54 crc kubenswrapper[4790]: E1003 08:40:54.328852 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.343330 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.357383 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.371627 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.386243 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7
977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.400954 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.415047 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.428621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.428673 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.428692 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.428710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.428708 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.428723 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.442739 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.463492 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f23e27271cc758d5c79c89c2c23cc95fdeec4f05bc5e026a05e0ce68da68541b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:46Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1003 08:40:46.779070 6066 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1003 08:40:46.779119 6066 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1003 08:40:46.779154 6066 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1003 08:40:46.779161 6066 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1003 08:40:46.779205 6066 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1003 08:40:46.779253 6066 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:46.779260 6066 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:46.779333 6066 factory.go:656] Stopping watch factory\\\\nI1003 08:40:46.779335 6066 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:40:46.779354 6066 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:46.779365 6066 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1003 08:40:46.779375 6066 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1003 08:40:46.779383 6066 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:46.779397 6066 handler.go:208] Removed *v1.Node event handler 7\\\\nI1003 08:40:46.779363 6066 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1003 08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"6 for removal\\\\nI1003 08:40:48.888763 6236 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:48.888772 6236 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:48.888784 6236 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:40:48.888812 6236 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1003 08:40:48.888830 6236 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 08:40:48.888831 6236 handler.go:208] Removed *v1.Namespace event handler 
1\\\\nI1003 08:40:48.888804 6236 factory.go:656] Stopping watch factory\\\\nI1003 08:40:48.888954 6236 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 08:40:48.889061 6236 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1003 08:40:48.889112 6236 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 2.188891ms\\\\nI1003 08:40:48.889115 6236 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 08:40:48.889164 6236 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:48.889201 6236 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:48.889310 6236 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.474086 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc 
kubenswrapper[4790]: I1003 08:40:54.488077 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.500962 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.515228 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.531204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.531670 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.531682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.531702 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.531715 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.536208 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.551701 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.571822 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:40:54Z is after 2025-08-24T17:21:41Z" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.634237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.634645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.634773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.634913 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.635026 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.737477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.737524 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.737537 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.737560 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.737593 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.840519 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.840565 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.840600 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.840617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.840626 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.942898 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.942947 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.942957 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.942972 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:54 crc kubenswrapper[4790]: I1003 08:40:54.942984 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:54Z","lastTransitionTime":"2025-10-03T08:40:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.045217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.045246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.045253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.045266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.045275 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.148765 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.148819 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.148829 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.148853 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.148864 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.252486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.252538 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.252548 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.252593 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.252606 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.354721 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.354792 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.354804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.354828 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.354843 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.458052 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.458097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.458112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.458130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.458141 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.561136 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.561185 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.561196 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.561211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.561222 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.664335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.664386 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.664397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.664413 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.664425 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.768319 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.768370 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.768387 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.768413 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.768431 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.871005 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.871057 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.871068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.871088 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.871104 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.974704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.974761 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.974772 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.974789 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:55 crc kubenswrapper[4790]: I1003 08:40:55.974803 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:55Z","lastTransitionTime":"2025-10-03T08:40:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.078668 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.078730 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.078740 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.078755 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.078765 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.182194 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.182249 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.182260 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.182281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.182293 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.285553 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.285657 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.285673 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.285696 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.285713 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.327687 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.327874 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:56 crc kubenswrapper[4790]: E1003 08:40:56.327902 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.327991 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.328072 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:56 crc kubenswrapper[4790]: E1003 08:40:56.328189 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:56 crc kubenswrapper[4790]: E1003 08:40:56.328272 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:56 crc kubenswrapper[4790]: E1003 08:40:56.328327 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.388675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.388733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.388749 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.388773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.388791 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.492475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.492538 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.492556 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.492610 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.492632 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.597137 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.597178 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.597188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.597207 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.597217 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.700854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.700907 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.700916 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.700932 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.700942 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.795775 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:56 crc kubenswrapper[4790]: E1003 08:40:56.795963 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:56 crc kubenswrapper[4790]: E1003 08:40:56.796030 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs podName:9236b957-81cb-40bc-969e-a2057884ba50 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:04.796013127 +0000 UTC m=+51.148947365 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs") pod "network-metrics-daemon-4vl95" (UID: "9236b957-81cb-40bc-969e-a2057884ba50") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.809167 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.809217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.809231 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.809249 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.809262 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.912151 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.912229 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.912256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.912290 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:56 crc kubenswrapper[4790]: I1003 08:40:56.912316 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:56Z","lastTransitionTime":"2025-10-03T08:40:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.015564 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.015646 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.015659 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.015678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.015693 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.119260 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.119334 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.119355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.119385 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.119420 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.223276 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.223770 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.223960 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.224203 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.224404 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.327856 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.327930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.327949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.327971 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.327991 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.430818 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.431283 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.431390 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.431494 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.431564 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.535025 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.535057 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.535068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.535083 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.535113 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.639046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.639123 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.639141 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.639169 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.639190 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.742217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.742257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.742287 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.742306 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.742318 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.845210 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.845279 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.845293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.845317 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.845331 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.949341 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.949399 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.949419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.949447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:57 crc kubenswrapper[4790]: I1003 08:40:57.949481 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:57Z","lastTransitionTime":"2025-10-03T08:40:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.052682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.052763 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.052778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.053134 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.053177 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.156100 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.156142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.156156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.156174 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.156184 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.259666 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.259777 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.259793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.259816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.259847 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.327882 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.327944 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.327918 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.327917 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:40:58 crc kubenswrapper[4790]: E1003 08:40:58.328069 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:40:58 crc kubenswrapper[4790]: E1003 08:40:58.328175 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:40:58 crc kubenswrapper[4790]: E1003 08:40:58.328262 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:40:58 crc kubenswrapper[4790]: E1003 08:40:58.328353 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.362370 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.362904 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.363104 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.363355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.363566 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.465941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.466020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.466033 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.466049 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.466061 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.569277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.569346 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.569360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.569387 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.569412 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.673350 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.673415 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.673427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.673447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.673461 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.776926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.776998 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.777019 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.777049 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.777069 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.879929 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.879991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.880007 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.880031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.880043 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.983552 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.983628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.983658 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.983683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:58 crc kubenswrapper[4790]: I1003 08:40:58.983696 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:58Z","lastTransitionTime":"2025-10-03T08:40:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.086692 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.086747 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.086758 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.086775 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.086790 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.190806 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.190908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.190934 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.190974 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.190998 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.294130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.294200 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.294214 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.294238 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.294252 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.397764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.397863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.397878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.397903 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.397918 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.503441 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.503498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.503510 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.503531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.503547 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.607409 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.607475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.607488 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.607509 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.607521 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.710974 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.711053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.711072 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.711107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.711127 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.814902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.814954 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.814963 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.814985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.814998 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.918132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.918191 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.918203 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.918224 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:40:59 crc kubenswrapper[4790]: I1003 08:40:59.918238 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:40:59Z","lastTransitionTime":"2025-10-03T08:40:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.021635 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.021697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.021706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.021723 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.021786 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.125148 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.125263 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.125274 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.125294 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.125323 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.229823 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.230301 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.230396 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.230496 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.230602 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.328076 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.328172 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:00 crc kubenswrapper[4790]: E1003 08:41:00.328284 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.328419 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.328483 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:00 crc kubenswrapper[4790]: E1003 08:41:00.328434 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:00 crc kubenswrapper[4790]: E1003 08:41:00.328536 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:00 crc kubenswrapper[4790]: E1003 08:41:00.328735 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.333824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.333886 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.333911 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.333937 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.333957 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.437145 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.437208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.437221 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.437243 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.437257 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.540953 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.541045 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.541070 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.541105 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.541128 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.644147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.644204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.644223 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.644253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.644272 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.747355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.747412 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.747423 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.747440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.747451 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.830485 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.830561 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.830611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.830643 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.830664 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: E1003 08:41:00.855791 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:00Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.861718 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.861849 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.861924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.861963 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.862052 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: E1003 08:41:00.886162 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:00Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.891159 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.891208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.891226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.891253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.891273 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} 
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:00Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.921080 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.921203 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.921227 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.921260 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.921286 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: E1003 08:41:00.942464 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:00Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.947828 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.947900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.947924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.947956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.947975 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:00 crc kubenswrapper[4790]: E1003 08:41:00.968438 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:00Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:00 crc kubenswrapper[4790]: E1003 08:41:00.968802 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.971422 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.971518 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.971551 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.971627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:00 crc kubenswrapper[4790]: I1003 08:41:00.971659 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:00Z","lastTransitionTime":"2025-10-03T08:41:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.075377 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.075447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.075459 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.075486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.075501 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.178941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.178981 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.178992 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.179012 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.179026 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.283057 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.283154 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.283180 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.283217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.283244 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.386444 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.386523 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.386550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.386612 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.386629 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.490704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.490774 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.490787 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.490812 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.490826 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.593764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.593828 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.593840 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.593860 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.593873 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.697183 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.697241 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.697251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.697275 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.697289 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.801078 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.801171 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.801213 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.801252 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.801278 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.903983 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.904075 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.904099 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.904132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:01 crc kubenswrapper[4790]: I1003 08:41:01.904155 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:01Z","lastTransitionTime":"2025-10-03T08:41:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.008374 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.008439 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.008454 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.008477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.008489 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.113738 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.113825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.113850 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.113885 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.113905 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.216835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.216905 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.216925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.216950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.216965 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.320927 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.321012 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.321031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.321064 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.321084 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.327615 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.327670 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.327971 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:02 crc kubenswrapper[4790]: E1003 08:41:02.328012 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.328075 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:02 crc kubenswrapper[4790]: E1003 08:41:02.329168 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:02 crc kubenswrapper[4790]: E1003 08:41:02.328190 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:02 crc kubenswrapper[4790]: E1003 08:41:02.329232 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.329694 4790 scope.go:117] "RemoveContainer" containerID="6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.349815 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.368294 4790 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.385274 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.403709 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.422924 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.423192 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.423595 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.423611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.423634 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.423671 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.438081 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.451119 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c7
72a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.469649 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc 
kubenswrapper[4790]: I1003 08:41:02.484852 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.499260 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.512923 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.526640 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.527069 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.527109 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.527119 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc 
kubenswrapper[4790]: I1003 08:41:02.527134 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.527145 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.547070 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"6 for removal\\\\nI1003 08:40:48.888763 6236 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:48.888772 6236 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:48.888784 6236 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:40:48.888812 6236 handler.go:208] Removed 
*v1.Namespace event handler 5\\\\nI1003 08:40:48.888830 6236 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 08:40:48.888831 6236 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:48.888804 6236 factory.go:656] Stopping watch factory\\\\nI1003 08:40:48.888954 6236 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 08:40:48.889061 6236 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1003 08:40:48.889112 6236 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 2.188891ms\\\\nI1003 08:40:48.889115 6236 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 08:40:48.889164 6236 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:48.889201 6236 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:48.889310 6236 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673c
ee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.563904 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.575359 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.589297 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.629767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.629816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.629827 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.629843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.629855 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.733161 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.733231 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.733245 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.733268 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.733286 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.778703 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/1.log" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.782605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.782775 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.796756 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.811242 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.827555 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.835914 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.835986 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.835999 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.836017 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.836029 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.843914 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.856743 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.872133 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8
37e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.888329 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.901809 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.915756 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7
977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.930042 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.939410 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.939464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.939474 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.939490 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.939512 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:02Z","lastTransitionTime":"2025-10-03T08:41:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.954946 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.976093 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:02 crc kubenswrapper[4790]: I1003 08:41:02.995878 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"6 for removal\\\\nI1003 08:40:48.888763 6236 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:48.888772 6236 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:48.888784 6236 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:40:48.888812 6236 handler.go:208] Removed 
*v1.Namespace event handler 5\\\\nI1003 08:40:48.888830 6236 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 08:40:48.888831 6236 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:48.888804 6236 factory.go:656] Stopping watch factory\\\\nI1003 08:40:48.888954 6236 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 08:40:48.889061 6236 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1003 08:40:48.889112 6236 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 2.188891ms\\\\nI1003 08:40:48.889115 6236 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 08:40:48.889164 6236 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:48.889201 6236 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:48.889310 6236 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:02Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.009391 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc 
kubenswrapper[4790]: I1003 08:41:03.023958 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.037522 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.042926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.042952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.042960 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.042977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.043007 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.145825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.145881 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.145895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.145924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.145943 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.248659 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.248730 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.248747 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.248774 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.248789 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.357558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.357927 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.357953 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.358549 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.358632 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.461680 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.461733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.461743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.461762 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.461774 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.564896 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.564967 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.564981 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.565008 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.565025 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.668502 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.668603 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.668622 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.668644 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.668658 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.771614 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.771710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.771732 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.771768 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.771789 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.788913 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/2.log" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.789517 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/1.log" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.794388 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91" exitCode=1 Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.794419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.794486 4790 scope.go:117] "RemoveContainer" containerID="6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.795919 4790 scope.go:117] "RemoveContainer" containerID="fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91" Oct 03 08:41:03 crc kubenswrapper[4790]: E1003 08:41:03.796204 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.824074 4790 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.842820 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.858687 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.874599 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.874644 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.874654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc 
kubenswrapper[4790]: I1003 08:41:03.874668 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.874680 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.888165 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"6 for removal\\\\nI1003 08:40:48.888763 6236 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:48.888772 6236 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:48.888784 6236 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:40:48.888812 6236 handler.go:208] Removed 
*v1.Namespace event handler 5\\\\nI1003 08:40:48.888830 6236 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 08:40:48.888831 6236 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:48.888804 6236 factory.go:656] Stopping watch factory\\\\nI1003 08:40:48.888954 6236 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 08:40:48.889061 6236 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1003 08:40:48.889112 6236 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 2.188891ms\\\\nI1003 08:40:48.889115 6236 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 08:40:48.889164 6236 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:48.889201 6236 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:48.889310 6236 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\
"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.901034 4790 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc 
kubenswrapper[4790]: I1003 08:41:03.916072 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.933017 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.945706 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.959944 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.974067 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:03Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.976880 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.976953 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.976977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.977016 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.977042 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:03Z","lastTransitionTime":"2025-10-03T08:41:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:03 crc kubenswrapper[4790]: I1003 08:41:03.993470 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:03Z 
is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.007227 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8
c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.025323 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.025665 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.041681 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0dae
bd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:4
3Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.057187 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.076383 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7
977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.079417 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.079475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.079498 4790 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.079532 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.079556 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.182638 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.182713 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.182732 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.182799 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.182820 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.286859 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.286957 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.286985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.287020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.287046 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.327816 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.327856 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:04 crc kubenswrapper[4790]: E1003 08:41:04.328039 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.328143 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.328199 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:04 crc kubenswrapper[4790]: E1003 08:41:04.328213 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:04 crc kubenswrapper[4790]: E1003 08:41:04.328356 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:04 crc kubenswrapper[4790]: E1003 08:41:04.328617 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.355070 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.372208 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.395303 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.395516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.395650 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.395697 4790 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.395734 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.395774 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.415161 4790 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.433907 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.450357 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.466489 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.482275 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.494730 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.499617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.499652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.499665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.499687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.499700 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.508781 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc 
kubenswrapper[4790]: I1003 08:41:04.523738 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.537549 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.550649 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.563744 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.583426 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fb145beaf63d143f4e9dbaa8da4340794905d8ee9b2b317b7f5389999c43b09\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"6 for removal\\\\nI1003 08:40:48.888763 6236 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1003 08:40:48.888772 6236 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1003 08:40:48.888784 6236 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:40:48.888812 6236 handler.go:208] Removed 
*v1.Namespace event handler 5\\\\nI1003 08:40:48.888830 6236 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1003 08:40:48.888831 6236 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:40:48.888804 6236 factory.go:656] Stopping watch factory\\\\nI1003 08:40:48.888954 6236 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1003 08:40:48.889061 6236 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/cluster-autoscaler-operator\\\\\\\"}\\\\nI1003 08:40:48.889112 6236 services_controller.go:360] Finished syncing service cluster-autoscaler-operator on namespace openshift-machine-api for network=default : 2.188891ms\\\\nI1003 08:40:48.889115 6236 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1003 08:40:48.889164 6236 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:40:48.889201 6236 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:40:48.889310 6236 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\
"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.599774 4790 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc 
kubenswrapper[4790]: I1003 08:41:04.601993 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.602058 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.602068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.602090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.602102 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.705437 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.705513 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.705527 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.705550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.705564 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.800422 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/2.log" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.805080 4790 scope.go:117] "RemoveContainer" containerID="fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91" Oct 03 08:41:04 crc kubenswrapper[4790]: E1003 08:41:04.805319 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.808364 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.808430 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.808447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.808471 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.808485 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.819774 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-pro
xy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.845865 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673c
ee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.857412 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.870195 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.891322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:04 crc kubenswrapper[4790]: E1003 08:41:04.891867 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:41:04 crc kubenswrapper[4790]: E1003 08:41:04.892047 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs podName:9236b957-81cb-40bc-969e-a2057884ba50 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:20.892010012 +0000 UTC m=+67.244944290 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs") pod "network-metrics-daemon-4vl95" (UID: "9236b957-81cb-40bc-969e-a2057884ba50") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.892748 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.907989 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.917688 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.917758 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.917785 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.917810 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.917824 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:04Z","lastTransitionTime":"2025-10-03T08:41:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.929814 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.947203 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.960593 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.974175 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:04 crc kubenswrapper[4790]: I1003 08:41:04.987741 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.000939 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:04Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.013476 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:05Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.020752 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.020796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.020808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.020833 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.020846 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:05Z","lastTransitionTime":"2025-10-03T08:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.025872 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:05Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.039031 4790 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:05Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.058180 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:05Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.123764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.123814 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.123824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.123844 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.123856 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:05Z","lastTransitionTime":"2025-10-03T08:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.227337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.227406 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.227434 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.227469 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.227497 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:05Z","lastTransitionTime":"2025-10-03T08:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.331415 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.331494 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.331518 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.331550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.331571 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:05Z","lastTransitionTime":"2025-10-03T08:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.435039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.435103 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.435122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.435150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.435169 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:05Z","lastTransitionTime":"2025-10-03T08:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.539074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.539163 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.539188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.539221 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.539260 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:05Z","lastTransitionTime":"2025-10-03T08:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.642972 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.643097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.643174 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.643216 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.643246 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:05Z","lastTransitionTime":"2025-10-03T08:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.746819 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.746878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.746890 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.746913 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.746926 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:05Z","lastTransitionTime":"2025-10-03T08:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.850863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.851164 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.851189 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.851226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.851248 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:05Z","lastTransitionTime":"2025-10-03T08:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.955139 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.955211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.955226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.955250 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.955265 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:05Z","lastTransitionTime":"2025-10-03T08:41:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.983981 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 03 08:41:05 crc kubenswrapper[4790]: I1003 08:41:05.998405 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:05Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.004195 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.004389 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 08:41:38.004358122 +0000 UTC m=+84.357292360 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.011645 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.024765 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.038902 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.051533 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.057792 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.057829 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.057840 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.057860 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.057870 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:06Z","lastTransitionTime":"2025-10-03T08:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.064880 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z 
is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.078650 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.096648 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.109959 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.124707 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7
977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.139028 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc 
kubenswrapper[4790]: I1003 08:41:06.158382 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.160768 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.160830 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.160848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.160871 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.160883 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:06Z","lastTransitionTime":"2025-10-03T08:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.176916 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.190877 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.207744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 
08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.207805 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.207844 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.207949 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208043 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208064 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208125 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.208101578 +0000 UTC m=+84.561035816 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208162 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.208139799 +0000 UTC m=+84.561074037 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208043 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208206 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208220 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208252 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.208245832 +0000 UTC m=+84.561180070 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208241 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208314 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208344 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.208462 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:38.208428717 +0000 UTC m=+84.561362995 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.208399 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.238409 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673c
ee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:06Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.264100 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.264157 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.264170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.264189 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.264205 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:06Z","lastTransitionTime":"2025-10-03T08:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.328059 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.328111 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.328068 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.328189 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.328270 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.328390 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.328478 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:06 crc kubenswrapper[4790]: E1003 08:41:06.328806 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.367733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.367792 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.367804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.367829 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.367845 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:06Z","lastTransitionTime":"2025-10-03T08:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.471455 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.472180 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.472206 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.472242 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.472265 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:06Z","lastTransitionTime":"2025-10-03T08:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.575655 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.575723 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.575741 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.575770 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.575796 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:06Z","lastTransitionTime":"2025-10-03T08:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.679319 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.679381 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.679395 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.679419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.679433 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:06Z","lastTransitionTime":"2025-10-03T08:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.783305 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.783349 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.783361 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.783392 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.783405 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:06Z","lastTransitionTime":"2025-10-03T08:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.887105 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.887179 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.887195 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.887223 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.887241 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:06Z","lastTransitionTime":"2025-10-03T08:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.989723 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.989775 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.989787 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.989806 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:06 crc kubenswrapper[4790]: I1003 08:41:06.989818 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:06Z","lastTransitionTime":"2025-10-03T08:41:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.092897 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.092954 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.092965 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.092984 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.092995 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:07Z","lastTransitionTime":"2025-10-03T08:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.196107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.196202 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.196230 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.196269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.196299 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:07Z","lastTransitionTime":"2025-10-03T08:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.300225 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.300287 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.300302 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.300329 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.300346 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:07Z","lastTransitionTime":"2025-10-03T08:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.403204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.403279 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.403297 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.403329 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.403347 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:07Z","lastTransitionTime":"2025-10-03T08:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.506722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.506771 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.506781 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.506799 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.506810 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:07Z","lastTransitionTime":"2025-10-03T08:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.610675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.610744 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.610762 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.610795 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.610814 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:07Z","lastTransitionTime":"2025-10-03T08:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.713657 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.713706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.713716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.713733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.713742 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:07Z","lastTransitionTime":"2025-10-03T08:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.817132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.817212 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.817241 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.817276 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.817301 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:07Z","lastTransitionTime":"2025-10-03T08:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.863656 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.875559 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.883702 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\
"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:07Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.904909 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:07Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.921825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.921901 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.921919 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.921946 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.921966 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:07Z","lastTransitionTime":"2025-10-03T08:41:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.926807 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9b
c994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:07Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.951322 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:07Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.962653 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:07Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.982013 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7
977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:07Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:07 crc kubenswrapper[4790]: I1003 08:41:07.996726 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:07Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.014192 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:08Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.024687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.024761 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.024779 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.024803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.024818 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:08Z","lastTransitionTime":"2025-10-03T08:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.029171 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3ab
a494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:08Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.050895 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673c
ee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:08Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.062859 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:08Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.074669 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:08Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.088859 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:08Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.100727 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:08Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.111832 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:08Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.127281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.127320 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.127331 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.127349 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.127361 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:08Z","lastTransitionTime":"2025-10-03T08:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.131035 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f
1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:08Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.230949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.231032 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.231055 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.231083 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.231102 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:08Z","lastTransitionTime":"2025-10-03T08:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.327886 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.327938 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.327980 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.327892 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:08 crc kubenswrapper[4790]: E1003 08:41:08.328106 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:08 crc kubenswrapper[4790]: E1003 08:41:08.328242 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:08 crc kubenswrapper[4790]: E1003 08:41:08.328380 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:08 crc kubenswrapper[4790]: E1003 08:41:08.328760 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.333188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.333244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.333254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.333277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.333292 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:08Z","lastTransitionTime":"2025-10-03T08:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.435727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.435774 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.435786 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.435804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.435818 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:08Z","lastTransitionTime":"2025-10-03T08:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.538123 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.538190 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.538202 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.538222 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.538255 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:08Z","lastTransitionTime":"2025-10-03T08:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.640795 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.640859 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.640879 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.640900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.640914 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:08Z","lastTransitionTime":"2025-10-03T08:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.743255 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.743304 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.743314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.743339 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.743353 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:08Z","lastTransitionTime":"2025-10-03T08:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.846538 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.846628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.846645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.846660 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.846669 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:08Z","lastTransitionTime":"2025-10-03T08:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.949257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.949301 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.949313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.949333 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:08 crc kubenswrapper[4790]: I1003 08:41:08.949344 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:08Z","lastTransitionTime":"2025-10-03T08:41:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.052694 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.052733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.052743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.052785 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.052795 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:09Z","lastTransitionTime":"2025-10-03T08:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.156035 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.156117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.156141 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.156176 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.156202 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:09Z","lastTransitionTime":"2025-10-03T08:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.259344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.259398 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.259414 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.259439 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.259455 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:09Z","lastTransitionTime":"2025-10-03T08:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.363789 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.363876 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.363903 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.363938 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.363964 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:09Z","lastTransitionTime":"2025-10-03T08:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.468200 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.468256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.468271 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.468293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.468308 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:09Z","lastTransitionTime":"2025-10-03T08:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.571458 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.571536 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.571559 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.571616 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.571637 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:09Z","lastTransitionTime":"2025-10-03T08:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.674994 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.675046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.675062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.675107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.675126 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:09Z","lastTransitionTime":"2025-10-03T08:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.783704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.783764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.783777 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.783799 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.783814 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:09Z","lastTransitionTime":"2025-10-03T08:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.887122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.887213 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.887234 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.887267 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.887289 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:09Z","lastTransitionTime":"2025-10-03T08:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.990354 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.990433 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.990452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.990481 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:09 crc kubenswrapper[4790]: I1003 08:41:09.990501 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:09Z","lastTransitionTime":"2025-10-03T08:41:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.093682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.093764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.093820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.093859 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.093883 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:10Z","lastTransitionTime":"2025-10-03T08:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.196770 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.196855 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.196879 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.196912 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.196936 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:10Z","lastTransitionTime":"2025-10-03T08:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.301082 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.301152 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.301175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.301206 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.301228 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:10Z","lastTransitionTime":"2025-10-03T08:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.328035 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.328146 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.328165 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.328445 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:10 crc kubenswrapper[4790]: E1003 08:41:10.328797 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:10 crc kubenswrapper[4790]: E1003 08:41:10.329044 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:10 crc kubenswrapper[4790]: E1003 08:41:10.329127 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:10 crc kubenswrapper[4790]: E1003 08:41:10.329206 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.404696 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.404778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.404798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.404826 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.404845 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:10Z","lastTransitionTime":"2025-10-03T08:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.508179 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.508230 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.508245 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.508269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.508284 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:10Z","lastTransitionTime":"2025-10-03T08:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.611949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.612031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.612048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.612083 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.612104 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:10Z","lastTransitionTime":"2025-10-03T08:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.716710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.716774 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.716793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.716820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.716841 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:10Z","lastTransitionTime":"2025-10-03T08:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.819649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.819727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.819748 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.819776 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.819797 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:10Z","lastTransitionTime":"2025-10-03T08:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.922818 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.922886 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.922906 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.922932 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:10 crc kubenswrapper[4790]: I1003 08:41:10.922948 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:10Z","lastTransitionTime":"2025-10-03T08:41:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.026335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.026390 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.026403 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.026426 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.026440 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.129843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.129979 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.130002 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.130034 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.130056 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.219422 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.219517 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.219543 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.219634 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.219667 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: E1003 08:41:11.241183 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:11Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.246104 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.246157 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.246175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.246201 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.246220 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: E1003 08:41:11.261188 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"...\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:11Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.265711 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.265770 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.265792 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.265817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.265837 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: E1003 08:41:11.287391 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"...\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:11Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.293150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.293204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.293214 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.293238 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.293249 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: E1003 08:41:11.315874 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:11Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.322296 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.322391 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.322418 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.322456 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.322481 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: E1003 08:41:11.344886 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:11Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:11 crc kubenswrapper[4790]: E1003 08:41:11.345702 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.348805 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.348881 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.348901 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.348932 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.348951 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.452411 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.452499 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.452517 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.453053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.453104 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.556352 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.556426 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.556447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.556477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.556497 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.659936 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.660005 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.660029 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.660061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.660085 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.763390 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.763448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.763458 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.763479 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.763492 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.866154 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.866225 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.866239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.866265 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.866281 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.969868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.969929 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.969942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.969964 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:11 crc kubenswrapper[4790]: I1003 08:41:11.969979 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:11Z","lastTransitionTime":"2025-10-03T08:41:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.073971 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.074025 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.074043 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.074068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.074087 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:12Z","lastTransitionTime":"2025-10-03T08:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.178252 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.178343 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.178362 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.178392 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.178420 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:12Z","lastTransitionTime":"2025-10-03T08:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.282173 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.282251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.282269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.282295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.282316 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:12Z","lastTransitionTime":"2025-10-03T08:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.327729 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.327807 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.327729 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:12 crc kubenswrapper[4790]: E1003 08:41:12.328010 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.328053 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:12 crc kubenswrapper[4790]: E1003 08:41:12.328226 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:12 crc kubenswrapper[4790]: E1003 08:41:12.328681 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:12 crc kubenswrapper[4790]: E1003 08:41:12.328756 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.384802 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.384842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.384853 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.384874 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.384886 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:12Z","lastTransitionTime":"2025-10-03T08:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.487223 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.487272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.487286 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.487304 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.487318 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:12Z","lastTransitionTime":"2025-10-03T08:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.590317 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.590367 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.590383 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.590403 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.590414 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:12Z","lastTransitionTime":"2025-10-03T08:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.692642 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.692683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.692691 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.692728 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.692739 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:12Z","lastTransitionTime":"2025-10-03T08:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.794703 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.794773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.794784 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.794801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.794812 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:12Z","lastTransitionTime":"2025-10-03T08:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.897679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.897735 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.897748 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.897763 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.897776 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:12Z","lastTransitionTime":"2025-10-03T08:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.999886 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.999926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.999937 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:12 crc kubenswrapper[4790]: I1003 08:41:12.999952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:12.999966 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:12Z","lastTransitionTime":"2025-10-03T08:41:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.103787 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.103827 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.103839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.103861 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.103885 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:13Z","lastTransitionTime":"2025-10-03T08:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.206311 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.206368 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.206380 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.206398 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.206410 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:13Z","lastTransitionTime":"2025-10-03T08:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.309628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.309691 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.309705 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.309727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.309742 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:13Z","lastTransitionTime":"2025-10-03T08:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.413178 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.413255 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.413269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.413289 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.413304 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:13Z","lastTransitionTime":"2025-10-03T08:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.516965 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.517020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.517035 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.517054 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.517067 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:13Z","lastTransitionTime":"2025-10-03T08:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.621113 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.621171 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.621179 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.621195 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.621205 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:13Z","lastTransitionTime":"2025-10-03T08:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.725372 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.725457 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.725482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.725516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.725545 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:13Z","lastTransitionTime":"2025-10-03T08:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.834826 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.834910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.834934 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.835068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.835099 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:13Z","lastTransitionTime":"2025-10-03T08:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.938654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.938724 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.938744 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.938773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:13 crc kubenswrapper[4790]: I1003 08:41:13.938794 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:13Z","lastTransitionTime":"2025-10-03T08:41:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.042808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.042881 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.042923 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.042963 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.042993 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:14Z","lastTransitionTime":"2025-10-03T08:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.146901 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.146975 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.146988 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.147012 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.147027 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:14Z","lastTransitionTime":"2025-10-03T08:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.250380 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.250437 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.250451 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.250477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.250519 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:14Z","lastTransitionTime":"2025-10-03T08:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.327247 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.327302 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.327251 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:14 crc kubenswrapper[4790]: E1003 08:41:14.327465 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:14 crc kubenswrapper[4790]: E1003 08:41:14.327608 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.327688 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:14 crc kubenswrapper[4790]: E1003 08:41:14.327737 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:14 crc kubenswrapper[4790]: E1003 08:41:14.327861 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.344069 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 
08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.353542 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.353591 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:14 crc 
kubenswrapper[4790]: I1003 08:41:14.353602 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.353619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.353630 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:14Z","lastTransitionTime":"2025-10-03T08:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.360743 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.378078 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.391383 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7108359a-23ed-4464-a43a-6a1ef9877500\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca9eba35e4f0edf15874f10d6b11af0b09a3c3a43eaf8340ec5604122463dc8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8b7cc917b16307413ed2fb2c91cf91b438cf0499138176e99562d0415bf872c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55ba2aa3b5d8466c1193dab58f6add86f97ed9065aaa71ba877ad002d7b30ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.404725 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.423536 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.439511 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.457627 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.463773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.463831 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:14 crc 
kubenswrapper[4790]: I1003 08:41:14.463848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.463873 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.463890 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:14Z","lastTransitionTime":"2025-10-03T08:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.478384 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.488286 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.501447 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7
977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.515022 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.527262 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.541996 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.560965 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673c
ee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.566798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.566848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.566860 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.566914 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.566933 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:14Z","lastTransitionTime":"2025-10-03T08:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.575715 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc 
kubenswrapper[4790]: I1003 08:41:14.590103 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:14Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.669527 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.669598 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.669610 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.669628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.669639 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:14Z","lastTransitionTime":"2025-10-03T08:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.773288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.773360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.773374 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.773394 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.773406 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:14Z","lastTransitionTime":"2025-10-03T08:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.876732 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.876782 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.876795 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.876813 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.876827 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:14Z","lastTransitionTime":"2025-10-03T08:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.980637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.980734 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.980762 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.980797 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:14 crc kubenswrapper[4790]: I1003 08:41:14.980821 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:14Z","lastTransitionTime":"2025-10-03T08:41:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.083962 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.084018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.084029 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.084053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.084070 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:15Z","lastTransitionTime":"2025-10-03T08:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.186817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.186881 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.186891 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.186908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.186918 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:15Z","lastTransitionTime":"2025-10-03T08:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.291604 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.291683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.291711 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.291745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.291775 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:15Z","lastTransitionTime":"2025-10-03T08:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.395175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.395335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.395367 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.395404 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.395428 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:15Z","lastTransitionTime":"2025-10-03T08:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.499041 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.499109 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.499122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.499144 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.499162 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:15Z","lastTransitionTime":"2025-10-03T08:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.602874 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.602930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.602942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.602970 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.602980 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:15Z","lastTransitionTime":"2025-10-03T08:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.706612 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.706689 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.706716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.706745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.706764 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:15Z","lastTransitionTime":"2025-10-03T08:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.809836 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.810063 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.810076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.810093 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.810108 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:15Z","lastTransitionTime":"2025-10-03T08:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.912568 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.912734 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.912771 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.912809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:15 crc kubenswrapper[4790]: I1003 08:41:15.912833 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:15Z","lastTransitionTime":"2025-10-03T08:41:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.015396 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.016169 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.016214 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.016235 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.016246 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:16Z","lastTransitionTime":"2025-10-03T08:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.122784 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.122839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.122849 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.122867 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.122878 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:16Z","lastTransitionTime":"2025-10-03T08:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.226158 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.226222 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.226241 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.226272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.226293 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:16Z","lastTransitionTime":"2025-10-03T08:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.327661 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.327651 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.327792 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:16 crc kubenswrapper[4790]: E1003 08:41:16.327914 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.327953 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:16 crc kubenswrapper[4790]: E1003 08:41:16.328068 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:16 crc kubenswrapper[4790]: E1003 08:41:16.328218 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:16 crc kubenswrapper[4790]: E1003 08:41:16.328318 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.329013 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.329066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.329086 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.329115 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.329231 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:16Z","lastTransitionTime":"2025-10-03T08:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.432664 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.432733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.432751 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.432782 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.432803 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:16Z","lastTransitionTime":"2025-10-03T08:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.535911 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.535945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.535954 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.535969 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.535980 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:16Z","lastTransitionTime":"2025-10-03T08:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.638801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.638833 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.638842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.638854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.638863 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:16Z","lastTransitionTime":"2025-10-03T08:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.742221 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.742273 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.742284 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.742301 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.742316 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:16Z","lastTransitionTime":"2025-10-03T08:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.846207 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.846261 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.846270 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.846290 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.846303 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:16Z","lastTransitionTime":"2025-10-03T08:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.949042 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.949446 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.949621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.949754 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:16 crc kubenswrapper[4790]: I1003 08:41:16.949846 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:16Z","lastTransitionTime":"2025-10-03T08:41:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.053398 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.053435 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.053444 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.053462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.053472 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:17Z","lastTransitionTime":"2025-10-03T08:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.155844 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.155894 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.155905 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.155923 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.155935 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:17Z","lastTransitionTime":"2025-10-03T08:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.259038 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.259125 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.259137 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.259156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.259171 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:17Z","lastTransitionTime":"2025-10-03T08:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.362396 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.362463 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.362479 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.362502 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.362517 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:17Z","lastTransitionTime":"2025-10-03T08:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.465723 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.465794 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.465808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.465831 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.465844 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:17Z","lastTransitionTime":"2025-10-03T08:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.568745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.568801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.568813 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.568831 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.568846 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:17Z","lastTransitionTime":"2025-10-03T08:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.672173 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.672219 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.672230 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.672244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.672254 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:17Z","lastTransitionTime":"2025-10-03T08:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.774507 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.774557 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.774567 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.774603 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.774619 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:17Z","lastTransitionTime":"2025-10-03T08:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.877527 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.877896 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.878021 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.878119 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.878225 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:17Z","lastTransitionTime":"2025-10-03T08:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.981284 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.981328 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.981337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.981352 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:17 crc kubenswrapper[4790]: I1003 08:41:17.981365 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:17Z","lastTransitionTime":"2025-10-03T08:41:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.084291 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.084337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.084354 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.084372 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.084382 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:18Z","lastTransitionTime":"2025-10-03T08:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.187804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.187860 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.187873 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.187894 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.187910 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:18Z","lastTransitionTime":"2025-10-03T08:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.291192 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.291256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.291272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.291297 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.291310 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:18Z","lastTransitionTime":"2025-10-03T08:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.328029 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.328113 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:18 crc kubenswrapper[4790]: E1003 08:41:18.328213 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.328238 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:18 crc kubenswrapper[4790]: E1003 08:41:18.328329 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:18 crc kubenswrapper[4790]: E1003 08:41:18.328398 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.328737 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:18 crc kubenswrapper[4790]: E1003 08:41:18.329005 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.394453 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.394533 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.394563 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.394643 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.394668 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:18Z","lastTransitionTime":"2025-10-03T08:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.498061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.498402 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.498493 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.498594 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.498667 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:18Z","lastTransitionTime":"2025-10-03T08:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.602532 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.602975 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.603148 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.603312 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.603411 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:18Z","lastTransitionTime":"2025-10-03T08:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.705865 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.706601 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.706666 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.706736 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.706800 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:18Z","lastTransitionTime":"2025-10-03T08:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.809215 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.809494 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.809568 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.809665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.809687 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:18Z","lastTransitionTime":"2025-10-03T08:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.912914 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.912961 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.912973 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.912993 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:18 crc kubenswrapper[4790]: I1003 08:41:18.913005 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:18Z","lastTransitionTime":"2025-10-03T08:41:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.016138 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.016181 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.016191 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.016210 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.016222 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:19Z","lastTransitionTime":"2025-10-03T08:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.122412 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.122448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.122458 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.122472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.122484 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:19Z","lastTransitionTime":"2025-10-03T08:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.225116 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.225162 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.225170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.225188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.225200 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:19Z","lastTransitionTime":"2025-10-03T08:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.327460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.327511 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.327527 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.327552 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.327567 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:19Z","lastTransitionTime":"2025-10-03T08:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.328442 4790 scope.go:117] "RemoveContainer" containerID="fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91" Oct 03 08:41:19 crc kubenswrapper[4790]: E1003 08:41:19.328725 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.431009 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.431053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.431063 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.431079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.431091 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:19Z","lastTransitionTime":"2025-10-03T08:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.533499 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.533546 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.533559 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.533597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.533611 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:19Z","lastTransitionTime":"2025-10-03T08:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.636319 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.636377 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.636391 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.636411 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.636422 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:19Z","lastTransitionTime":"2025-10-03T08:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.740204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.740873 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.740919 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.740948 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.740970 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:19Z","lastTransitionTime":"2025-10-03T08:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.845869 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.845920 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.845931 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.845949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.845961 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:19Z","lastTransitionTime":"2025-10-03T08:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.948864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.948945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.948977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.949001 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:19 crc kubenswrapper[4790]: I1003 08:41:19.949015 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:19Z","lastTransitionTime":"2025-10-03T08:41:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.052028 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.052077 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.052089 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.052110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.052121 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:20Z","lastTransitionTime":"2025-10-03T08:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.155379 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.155448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.155462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.155483 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.155501 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:20Z","lastTransitionTime":"2025-10-03T08:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.258202 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.258251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.258262 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.258282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.258295 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:20Z","lastTransitionTime":"2025-10-03T08:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.328127 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.328220 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.328217 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.328783 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:20 crc kubenswrapper[4790]: E1003 08:41:20.328869 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:20 crc kubenswrapper[4790]: E1003 08:41:20.332669 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:20 crc kubenswrapper[4790]: E1003 08:41:20.332820 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:20 crc kubenswrapper[4790]: E1003 08:41:20.332830 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.362271 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.362719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.362759 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.362782 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.362794 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:20Z","lastTransitionTime":"2025-10-03T08:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.465422 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.465490 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.465502 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.465521 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.465534 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:20Z","lastTransitionTime":"2025-10-03T08:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.568444 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.568510 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.568527 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.568554 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.568596 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:20Z","lastTransitionTime":"2025-10-03T08:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.671804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.671864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.671882 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.671910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.671934 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:20Z","lastTransitionTime":"2025-10-03T08:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.775923 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.775992 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.776008 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.776032 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.776048 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:20Z","lastTransitionTime":"2025-10-03T08:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.878520 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.878596 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.878610 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.878654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.878668 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:20Z","lastTransitionTime":"2025-10-03T08:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.982488 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.982546 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.982564 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.982634 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.982659 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:20Z","lastTransitionTime":"2025-10-03T08:41:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:20 crc kubenswrapper[4790]: I1003 08:41:20.985262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:20 crc kubenswrapper[4790]: E1003 08:41:20.985547 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:41:20 crc kubenswrapper[4790]: E1003 08:41:20.985697 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs podName:9236b957-81cb-40bc-969e-a2057884ba50 nodeName:}" failed. No retries permitted until 2025-10-03 08:41:52.985663215 +0000 UTC m=+99.338597483 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs") pod "network-metrics-daemon-4vl95" (UID: "9236b957-81cb-40bc-969e-a2057884ba50") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.085364 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.085417 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.085428 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.085447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.085459 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.188049 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.188086 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.188094 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.188112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.188122 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.291440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.291490 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.291506 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.291531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.291550 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.400918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.400950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.400958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.400976 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.400985 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.504054 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.504096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.504106 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.504122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.504133 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.607843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.607903 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.607928 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.607950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.607966 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.684982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.685028 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.685038 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.685061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.685077 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: E1003 08:41:21.715786 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:21Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.725805 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.725851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.725864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.725887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.725906 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: E1003 08:41:21.751336 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:21Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.758437 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.758480 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.758493 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.758511 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.758524 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: E1003 08:41:21.784565 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:21Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.789953 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.790009 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.790023 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.790047 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.790061 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: E1003 08:41:21.808352 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:21Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.812704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.812736 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.812745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.812762 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.812774 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: E1003 08:41:21.826315 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:21Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:21 crc kubenswrapper[4790]: E1003 08:41:21.826484 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.828167 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.828189 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.828197 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.828212 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.828224 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.931335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.931388 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.931401 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.931421 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:21 crc kubenswrapper[4790]: I1003 08:41:21.931434 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:21Z","lastTransitionTime":"2025-10-03T08:41:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.035010 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.035299 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.035399 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.035504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.035607 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:22Z","lastTransitionTime":"2025-10-03T08:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.137949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.138300 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.138366 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.138434 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.138501 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:22Z","lastTransitionTime":"2025-10-03T08:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.241351 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.241392 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.241403 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.241419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.241431 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:22Z","lastTransitionTime":"2025-10-03T08:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.328233 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.328358 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.328358 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.328439 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:22 crc kubenswrapper[4790]: E1003 08:41:22.328717 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:22 crc kubenswrapper[4790]: E1003 08:41:22.328857 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:22 crc kubenswrapper[4790]: E1003 08:41:22.329135 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:22 crc kubenswrapper[4790]: E1003 08:41:22.329210 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.343790 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.343830 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.343839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.343853 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.343863 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:22Z","lastTransitionTime":"2025-10-03T08:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.448223 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.448298 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.448318 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.448347 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.448368 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:22Z","lastTransitionTime":"2025-10-03T08:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.551745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.551805 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.551818 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.551843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.551855 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:22Z","lastTransitionTime":"2025-10-03T08:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.654427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.654487 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.654503 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.654525 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.654541 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:22Z","lastTransitionTime":"2025-10-03T08:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.757436 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.758234 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.758317 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.758419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.758486 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:22Z","lastTransitionTime":"2025-10-03T08:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.861354 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.861997 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.862087 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.862184 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.862275 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:22Z","lastTransitionTime":"2025-10-03T08:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.964774 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.964825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.964839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.964861 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:22 crc kubenswrapper[4790]: I1003 08:41:22.964875 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:22Z","lastTransitionTime":"2025-10-03T08:41:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.066828 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.067143 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.067210 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.067282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.067348 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:23Z","lastTransitionTime":"2025-10-03T08:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.170536 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.170908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.170997 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.171097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.171183 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:23Z","lastTransitionTime":"2025-10-03T08:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.274519 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.274589 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.274603 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.274622 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.274638 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:23Z","lastTransitionTime":"2025-10-03T08:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.377517 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.377567 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.377598 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.377619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.377635 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:23Z","lastTransitionTime":"2025-10-03T08:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.481070 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.481121 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.481141 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.481160 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.481171 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:23Z","lastTransitionTime":"2025-10-03T08:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.584272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.584320 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.584335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.584356 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.584376 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:23Z","lastTransitionTime":"2025-10-03T08:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.687393 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.687434 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.687444 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.687460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.687472 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:23Z","lastTransitionTime":"2025-10-03T08:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.789586 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.789624 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.789633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.789648 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.789661 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:23Z","lastTransitionTime":"2025-10-03T08:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.892888 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.892941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.892952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.892974 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.892987 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:23Z","lastTransitionTime":"2025-10-03T08:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.995903 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.995940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.995948 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.995966 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:23 crc kubenswrapper[4790]: I1003 08:41:23.995975 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:23Z","lastTransitionTime":"2025-10-03T08:41:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.098701 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.098749 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.098761 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.098782 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.098794 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:24Z","lastTransitionTime":"2025-10-03T08:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.202701 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.202762 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.202779 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.202807 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.202823 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:24Z","lastTransitionTime":"2025-10-03T08:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.306529 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.306644 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.306663 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.306691 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.306708 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:24Z","lastTransitionTime":"2025-10-03T08:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.327257 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.327295 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.327259 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.327285 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:24 crc kubenswrapper[4790]: E1003 08:41:24.327433 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:24 crc kubenswrapper[4790]: E1003 08:41:24.327470 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:24 crc kubenswrapper[4790]: E1003 08:41:24.327724 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:24 crc kubenswrapper[4790]: E1003 08:41:24.327878 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.346750 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.363861 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.377217 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.389691 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.411483 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.411422 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673c
ee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.411540 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.411709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.411740 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.411755 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:24Z","lastTransitionTime":"2025-10-03T08:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.422970 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc 
kubenswrapper[4790]: I1003 08:41:24.436564 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821
ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.449937 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.464488 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.483656 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.495597 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7108359a-23ed-4464-a43a-6a1ef9877500\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca9eba35e4f0edf15874f10d6b11af0b09a3c3a43eaf8340ec5604122463dc8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8b7cc917b16307413ed2fb2c91cf91b438cf0499138176e99562d0415bf872c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55ba2aa3b5d8466c1193dab58f6add86f97ed9065aaa71ba877ad002d7b30ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.507245 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.514139 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.514198 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.514211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.514232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.514244 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:24Z","lastTransitionTime":"2025-10-03T08:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.521565 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z 
is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.533980 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.547958 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.559156 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.573983 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7
977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.616905 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.616947 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.616959 4790 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.616977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.616987 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:24Z","lastTransitionTime":"2025-10-03T08:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.719536 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.719637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.719652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.719671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.719683 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:24Z","lastTransitionTime":"2025-10-03T08:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.823001 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.823066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.823078 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.823100 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.823113 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:24Z","lastTransitionTime":"2025-10-03T08:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.884082 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hncl2_f2f5631d-8486-44f9-81b0-ee78515e7866/kube-multus/0.log" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.884356 4790 generic.go:334] "Generic (PLEG): container finished" podID="f2f5631d-8486-44f9-81b0-ee78515e7866" containerID="3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd" exitCode=1 Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.884428 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hncl2" event={"ID":"f2f5631d-8486-44f9-81b0-ee78515e7866","Type":"ContainerDied","Data":"3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.887998 4790 scope.go:117] "RemoveContainer" containerID="3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.907172 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f
1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.921456 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.926737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.926803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.926818 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.926845 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.926860 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:24Z","lastTransitionTime":"2025-10-03T08:41:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.937038 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.953223 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.968158 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7108359a-23ed-4464-a43a-6a1ef9877500\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca9eba35e4f0edf15874f10d6b11af0b09a3c3a43eaf8340ec5604122463dc8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8b7cc917b16307413ed2fb2c91cf91b438cf0499138176e99562d0415bf872c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55ba2aa3b5d8466c1193dab58f6add86f97ed9065aaa71ba877ad002d7b30ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.982864 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:24 crc kubenswrapper[4790]: I1003 08:41:24.999164 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:24Z\\\",\\\"message\\\":\\\"2025-10-03T08:40:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354\\\\n2025-10-03T08:40:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354 to /host/opt/cni/bin/\\\\n2025-10-03T08:40:39Z [verbose] multus-daemon started\\\\n2025-10-03T08:40:39Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:41:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:24Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.013792 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.029059 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.029246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.029315 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.029400 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.029470 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:25Z","lastTransitionTime":"2025-10-03T08:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.031406 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.046077 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.057561 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.078761 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry 
successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673c
ee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.089944 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.103975 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.117814 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.129821 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.131752 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.131796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.131824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.131842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.131852 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:25Z","lastTransitionTime":"2025-10-03T08:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.142517 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.235066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.235107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.235117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.235137 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.235148 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:25Z","lastTransitionTime":"2025-10-03T08:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.337671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.337742 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.337756 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.337779 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.337796 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:25Z","lastTransitionTime":"2025-10-03T08:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.440937 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.440992 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.441006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.441026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.441037 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:25Z","lastTransitionTime":"2025-10-03T08:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.544467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.544530 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.544546 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.544593 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.544611 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:25Z","lastTransitionTime":"2025-10-03T08:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.648385 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.648446 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.648457 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.648491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.648502 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:25Z","lastTransitionTime":"2025-10-03T08:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.751891 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.751941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.751952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.751973 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.751985 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:25Z","lastTransitionTime":"2025-10-03T08:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.855287 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.855341 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.855353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.855379 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.855393 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:25Z","lastTransitionTime":"2025-10-03T08:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.890514 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hncl2_f2f5631d-8486-44f9-81b0-ee78515e7866/kube-multus/0.log" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.890615 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hncl2" event={"ID":"f2f5631d-8486-44f9-81b0-ee78515e7866","Type":"ContainerStarted","Data":"934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.907248 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7108359a-23ed-4464-a43a-6a1ef9877500\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca9eba35e4f0edf15874f10d6b11af0b09a3c3a43eaf8340ec5604122463dc8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8b7cc917b16307413ed2fb2c91cf91b438cf0499138176e99562d0415bf872c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55ba2aa3b5d8466c1193dab58f6add86f97ed9065aaa71ba877ad002d7b30ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.927183 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.943990 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:24Z\\\",\\\"message\\\":\\\"2025-10-03T08:40:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354\\\\n2025-10-03T08:40:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354 to /host/opt/cni/bin/\\\\n2025-10-03T08:40:39Z [verbose] multus-daemon started\\\\n2025-10-03T08:40:39Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:41:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.959050 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.959200 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.959309 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.959318 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.959333 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.959344 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:25Z","lastTransitionTime":"2025-10-03T08:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.975829 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:25 crc kubenswrapper[4790]: I1003 08:41:25.992839 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:25Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.005963 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.019524 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7
977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.041357 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.058566 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.062561 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.062616 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.062629 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.062650 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.062665 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:26Z","lastTransitionTime":"2025-10-03T08:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.075206 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3ab
a494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.097797 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673c
ee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.111038 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.127344 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.144831 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.156364 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.164823 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.164899 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.164911 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.164933 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.164947 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:26Z","lastTransitionTime":"2025-10-03T08:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.169988 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:26Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.268046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.268097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.268106 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.268125 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.268137 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:26Z","lastTransitionTime":"2025-10-03T08:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.328273 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.328329 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.328424 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:26 crc kubenswrapper[4790]: E1003 08:41:26.328465 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:26 crc kubenswrapper[4790]: E1003 08:41:26.328811 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:26 crc kubenswrapper[4790]: E1003 08:41:26.328896 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.328988 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:26 crc kubenswrapper[4790]: E1003 08:41:26.329128 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.371050 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.371118 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.371140 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.371538 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.371790 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:26Z","lastTransitionTime":"2025-10-03T08:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.475351 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.475398 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.475410 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.475431 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.475445 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:26Z","lastTransitionTime":"2025-10-03T08:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.578697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.578756 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.578781 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.578805 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.578821 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:26Z","lastTransitionTime":"2025-10-03T08:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.681671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.681708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.681717 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.681733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.681744 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:26Z","lastTransitionTime":"2025-10-03T08:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.784808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.784884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.784900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.784949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.784968 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:26Z","lastTransitionTime":"2025-10-03T08:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.888939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.888985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.888997 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.889014 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.889026 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:26Z","lastTransitionTime":"2025-10-03T08:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.992197 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.992237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.992249 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.992271 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:26 crc kubenswrapper[4790]: I1003 08:41:26.992282 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:26Z","lastTransitionTime":"2025-10-03T08:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.095257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.095319 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.095343 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.095379 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.095405 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:27Z","lastTransitionTime":"2025-10-03T08:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.198536 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.198594 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.198607 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.198627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.198641 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:27Z","lastTransitionTime":"2025-10-03T08:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.301265 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.301307 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.301319 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.301337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.301350 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:27Z","lastTransitionTime":"2025-10-03T08:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.404015 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.404061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.404073 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.404090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.404100 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:27Z","lastTransitionTime":"2025-10-03T08:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.507469 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.507522 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.507534 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.507554 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.507591 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:27Z","lastTransitionTime":"2025-10-03T08:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.611033 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.611078 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.611088 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.611107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.611116 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:27Z","lastTransitionTime":"2025-10-03T08:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.714288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.714368 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.714391 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.714427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.714450 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:27Z","lastTransitionTime":"2025-10-03T08:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.816847 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.816906 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.816916 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.816936 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.816947 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:27Z","lastTransitionTime":"2025-10-03T08:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.921550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.921627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.921668 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.921690 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:27 crc kubenswrapper[4790]: I1003 08:41:27.921703 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:27Z","lastTransitionTime":"2025-10-03T08:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.024908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.024958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.024970 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.024993 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.025012 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:28Z","lastTransitionTime":"2025-10-03T08:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.128565 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.128653 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.128665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.128683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.128694 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:28Z","lastTransitionTime":"2025-10-03T08:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.233540 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.233607 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.233617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.233633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.233647 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:28Z","lastTransitionTime":"2025-10-03T08:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.327930 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.328066 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:28 crc kubenswrapper[4790]: E1003 08:41:28.328099 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:28 crc kubenswrapper[4790]: E1003 08:41:28.328315 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.328384 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:28 crc kubenswrapper[4790]: E1003 08:41:28.328446 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.328605 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:28 crc kubenswrapper[4790]: E1003 08:41:28.328670 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.335762 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.335808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.335819 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.335836 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.335849 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:28Z","lastTransitionTime":"2025-10-03T08:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.437954 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.438004 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.438015 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.438035 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.438046 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:28Z","lastTransitionTime":"2025-10-03T08:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.541662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.541771 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.541784 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.541807 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.541822 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:28Z","lastTransitionTime":"2025-10-03T08:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.644312 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.644351 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.644364 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.644382 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.644394 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:28Z","lastTransitionTime":"2025-10-03T08:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.747201 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.747248 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.747257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.747277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.747299 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:28Z","lastTransitionTime":"2025-10-03T08:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.850458 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.850551 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.850630 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.850683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.850721 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:28Z","lastTransitionTime":"2025-10-03T08:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.953985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.954036 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.954049 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.954067 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:28 crc kubenswrapper[4790]: I1003 08:41:28.954079 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:28Z","lastTransitionTime":"2025-10-03T08:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.057661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.057768 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.057793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.057822 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.057842 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:29Z","lastTransitionTime":"2025-10-03T08:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.160684 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.160794 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.160808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.160827 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.160839 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:29Z","lastTransitionTime":"2025-10-03T08:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.264252 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.264308 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.264321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.264342 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.264354 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:29Z","lastTransitionTime":"2025-10-03T08:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.367778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.367869 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.367896 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.367932 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.367955 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:29Z","lastTransitionTime":"2025-10-03T08:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.472230 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.472311 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.472337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.472365 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.472383 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:29Z","lastTransitionTime":"2025-10-03T08:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.575412 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.575480 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.575496 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.575521 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.575541 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:29Z","lastTransitionTime":"2025-10-03T08:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.679142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.679237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.679262 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.679299 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.679319 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:29Z","lastTransitionTime":"2025-10-03T08:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.783058 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.783118 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.783130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.783151 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.783164 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:29Z","lastTransitionTime":"2025-10-03T08:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.886057 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.886108 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.886119 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.886141 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.886153 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:29Z","lastTransitionTime":"2025-10-03T08:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.989314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.989411 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.989423 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.989444 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:29 crc kubenswrapper[4790]: I1003 08:41:29.989456 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:29Z","lastTransitionTime":"2025-10-03T08:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.092683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.092758 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.092776 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.092803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.092822 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:30Z","lastTransitionTime":"2025-10-03T08:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.195501 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.195632 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.195654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.195682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.195701 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:30Z","lastTransitionTime":"2025-10-03T08:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.299124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.299469 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.299600 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.299710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.299792 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:30Z","lastTransitionTime":"2025-10-03T08:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.328261 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.328370 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.328610 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:30 crc kubenswrapper[4790]: E1003 08:41:30.328562 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.328632 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:30 crc kubenswrapper[4790]: E1003 08:41:30.328741 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:30 crc kubenswrapper[4790]: E1003 08:41:30.328913 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:30 crc kubenswrapper[4790]: E1003 08:41:30.329064 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.402465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.402538 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.402558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.402614 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.402635 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:30Z","lastTransitionTime":"2025-10-03T08:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.505303 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.505363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.505373 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.505393 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.505405 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:30Z","lastTransitionTime":"2025-10-03T08:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.609607 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.609679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.609697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.609722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.609751 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:30Z","lastTransitionTime":"2025-10-03T08:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.713186 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.713337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.713366 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.713402 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.713429 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:30Z","lastTransitionTime":"2025-10-03T08:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.816978 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.817051 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.817071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.817098 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.817117 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:30Z","lastTransitionTime":"2025-10-03T08:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.920442 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.920524 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.920550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.920613 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:30 crc kubenswrapper[4790]: I1003 08:41:30.920638 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:30Z","lastTransitionTime":"2025-10-03T08:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.024269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.024318 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.024333 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.024357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.024375 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:31Z","lastTransitionTime":"2025-10-03T08:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.127696 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.127746 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.127755 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.127771 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.127781 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:31Z","lastTransitionTime":"2025-10-03T08:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.230960 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.231030 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.231050 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.231074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.231088 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:31Z","lastTransitionTime":"2025-10-03T08:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.334057 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.334097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.334107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.334119 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.334129 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:31Z","lastTransitionTime":"2025-10-03T08:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.437548 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.437655 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.437674 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.437702 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.437721 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:31Z","lastTransitionTime":"2025-10-03T08:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.541701 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.541940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.541966 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.541993 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.542015 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:31Z","lastTransitionTime":"2025-10-03T08:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.645560 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.645688 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.645713 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.645741 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.645762 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:31Z","lastTransitionTime":"2025-10-03T08:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.749470 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.749731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.749842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.749943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.750062 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:31Z","lastTransitionTime":"2025-10-03T08:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.852932 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.853277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.853423 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.853527 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.853666 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:31Z","lastTransitionTime":"2025-10-03T08:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.956659 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.956699 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.956712 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.956731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:31 crc kubenswrapper[4790]: I1003 08:41:31.956744 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:31Z","lastTransitionTime":"2025-10-03T08:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.059958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.060316 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.060384 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.060464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.060542 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.118394 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.118438 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.118451 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.118474 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.118494 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: E1003 08:41:32.132108 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.138639 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.138692 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.138708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.138727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.138740 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: E1003 08:41:32.152682 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.157896 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.157943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.157954 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.157972 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.157982 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: E1003 08:41:32.174237 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.178888 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.178927 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.178940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.178960 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.178976 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: E1003 08:41:32.193461 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.198308 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.198364 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.198376 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.198397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.198411 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: E1003 08:41:32.215483 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:32Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:32 crc kubenswrapper[4790]: E1003 08:41:32.215690 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.217772 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.217803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.217821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.217843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.217855 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.322182 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.322234 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.322252 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.322277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.322295 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.328010 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.328077 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.328128 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.328290 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:32 crc kubenswrapper[4790]: E1003 08:41:32.328665 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:32 crc kubenswrapper[4790]: E1003 08:41:32.328803 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:32 crc kubenswrapper[4790]: E1003 08:41:32.329295 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:32 crc kubenswrapper[4790]: E1003 08:41:32.329490 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.425289 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.425328 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.425341 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.425360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.425374 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.528653 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.528704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.528713 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.528734 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.528745 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.631084 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.631138 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.631151 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.631170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.631185 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.734780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.734876 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.734890 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.734914 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.734929 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.837370 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.837438 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.837466 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.837500 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.837530 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.940475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.940924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.941013 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.941102 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:32 crc kubenswrapper[4790]: I1003 08:41:32.941177 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:32Z","lastTransitionTime":"2025-10-03T08:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.044920 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.045042 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.045068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.045098 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.045123 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:33Z","lastTransitionTime":"2025-10-03T08:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.148145 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.148200 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.148209 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.148226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.148239 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:33Z","lastTransitionTime":"2025-10-03T08:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.251550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.251637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.251668 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.251689 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.251701 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:33Z","lastTransitionTime":"2025-10-03T08:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.328231 4790 scope.go:117] "RemoveContainer" containerID="fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.354052 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.354092 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.354102 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.354122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.354134 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:33Z","lastTransitionTime":"2025-10-03T08:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.458677 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.458733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.458747 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.458769 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.458783 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:33Z","lastTransitionTime":"2025-10-03T08:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.562687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.562746 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.562760 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.562781 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.562794 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:33Z","lastTransitionTime":"2025-10-03T08:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.665956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.666005 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.666020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.666043 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.666059 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:33Z","lastTransitionTime":"2025-10-03T08:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.768755 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.768791 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.768801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.768817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.768826 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:33Z","lastTransitionTime":"2025-10-03T08:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.871320 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.871371 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.871391 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.871413 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.871425 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:33Z","lastTransitionTime":"2025-10-03T08:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.923981 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/2.log" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.927543 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.928803 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.946940 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:33Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.959593 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:33Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.974024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.974066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.974076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.974094 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.974105 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:33Z","lastTransitionTime":"2025-10-03T08:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:33 crc kubenswrapper[4790]: I1003 08:41:33.978719 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-03T08:41:33Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.007760 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:33Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.024018 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc 
kubenswrapper[4790]: I1003 08:41:34.039736 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.059744 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\
\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20
abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.072097 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.076453 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.076506 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.076520 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.076543 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.076557 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:34Z","lastTransitionTime":"2025-10-03T08:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.088638 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.101389 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7108359a-23ed-4464-a43a-6a1ef9877500\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca9eba35e4f0edf15874f10d6b11af0b09a3c3a43eaf8340ec5604122463dc8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8b7cc917b16307413ed2fb2c91cf91b438cf0499138176e99562d0415bf872c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55ba2aa3b5d8466c1193dab58f6add86f97ed9065aaa71ba877ad002d7b30ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.116357 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.137547 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:24Z\\\",\\\"message\\\":\\\"2025-10-03T08:40:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354\\\\n2025-10-03T08:40:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354 to /host/opt/cni/bin/\\\\n2025-10-03T08:40:39Z [verbose] multus-daemon started\\\\n2025-10-03T08:40:39Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:41:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.155279 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.174065 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.179066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.179093 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.179101 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 
08:41:34.179118 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.179129 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:34Z","lastTransitionTime":"2025-10-03T08:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.190808 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9
552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.201796 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.215309 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.281962 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.282021 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.282040 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.282068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.282081 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:34Z","lastTransitionTime":"2025-10-03T08:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.328245 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:34 crc kubenswrapper[4790]: E1003 08:41:34.328419 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.328476 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:34 crc kubenswrapper[4790]: E1003 08:41:34.328636 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.328696 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:34 crc kubenswrapper[4790]: E1003 08:41:34.328765 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.328897 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:34 crc kubenswrapper[4790]: E1003 08:41:34.328974 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.346111 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.359317 4790 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7108359a-23ed-4464-a43a-6a1ef9877500\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca9eba35e4f0edf15874f10d6b11af0b09a3c3a43eaf8340ec5604122463dc8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8b7cc917b16307413ed2fb2c91cf91b438cf0499138176e99562d0415bf872c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55ba2aa3b5d8466c1193dab58f6add86f97ed9065aaa71ba877ad002d7b30ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.372829 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.384224 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.384266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.384276 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.384297 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.384310 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:34Z","lastTransitionTime":"2025-10-03T08:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.387143 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:24Z\\\",\\\"message\\\":\\\"2025-10-03T08:40:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354\\\\n2025-10-03T08:40:38+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354 to /host/opt/cni/bin/\\\\n2025-10-03T08:40:39Z [verbose] multus-daemon started\\\\n2025-10-03T08:40:39Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:41:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.402363 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.416763 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.428071 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.439197 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7
977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.457361 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.468445 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.481437 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.487526 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.487600 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.487615 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.487638 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.487655 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:34Z","lastTransitionTime":"2025-10-03T08:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.492466 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3ab
a494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.511330 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.526331 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc 
kubenswrapper[4790]: I1003 08:41:34.542840 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821
ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.554880 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.568556 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.590930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.590966 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.590976 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.590991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.591003 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:34Z","lastTransitionTime":"2025-10-03T08:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.693617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.693664 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.693674 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.693691 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.693702 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:34Z","lastTransitionTime":"2025-10-03T08:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.796278 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.796331 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.796344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.796360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.796370 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:34Z","lastTransitionTime":"2025-10-03T08:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.901079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.901134 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.901148 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.901171 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.901185 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:34Z","lastTransitionTime":"2025-10-03T08:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.933947 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/3.log" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.934843 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/2.log" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.938699 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8" exitCode=1 Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.938752 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8"} Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.938829 4790 scope.go:117] "RemoveContainer" containerID="fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.939345 4790 scope.go:117] "RemoveContainer" containerID="873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8" Oct 03 08:41:34 crc kubenswrapper[4790]: E1003 08:41:34.939529 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.962959 4790 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.976074 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:34 crc kubenswrapper[4790]: I1003 08:41:34.989646 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:34Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.004630 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.004719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.004731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:35 crc 
kubenswrapper[4790]: I1003 08:41:35.004748 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.004759 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:35Z","lastTransitionTime":"2025-10-03T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.011279 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa6b0394cf2fb97d539acd9a37b2c7184af5640122bf0b1d3f0fb663a78a9e91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:03Z\\\",\\\"message\\\":\\\"resolver-gprtl in node crc\\\\nI1003 08:41:03.132081 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-gprtl after 0 failed attempt(s)\\\\nI1003 08:41:03.132085 6465 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1003 08:41:03.132087 6465 default_network_controller.go:776] Recording success event on pod 
openshift-dns/node-resolver-gprtl\\\\nI1003 08:41:03.132018 6465 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI1003 08:41:03.132100 6465 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1003 08:41:03.132110 6465 handler.go:208] Removed *v1.Node event handler 2\\\\nI1003 08:41:03.132122 6465 factory.go:656] Stopping watch factory\\\\nI1003 08:41:03.132130 6465 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1003 08:41:03.132141 6465 ovnkube.go:599] Stopped ovnkube\\\\nI1003 08:41:03.132178 6465 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1003 08:41:03.132192 6465 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1003 08:41:03.132253 6465 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:34Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 08:41:34.148688 6814 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.024044 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.040694 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.064703 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.078458 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.092344 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.108830 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7108359a-23ed-4464-a43a-6a1ef9877500\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca9eba35e4f0edf15874f10d6b11af0b09a3c3a43eaf8340ec5604122463dc8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8b7cc917b16307413ed2fb2c91cf91b438cf0499138176e99562d0415bf872c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55ba2aa3b5d8466c1193dab58f6add86f97ed9065aaa71ba877ad002d7b30ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.109001 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.109088 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.109099 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.109118 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.109166 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:35Z","lastTransitionTime":"2025-10-03T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.125526 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.143009 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fde680972c360
57c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:24Z\\\",\\\"message\\\":\\\"2025-10-03T08:40:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354\\\\n2025-10-03T08:40:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354 to /host/opt/cni/bin/\\\\n2025-10-03T08:40:39Z [verbose] multus-daemon started\\\\n2025-10-03T08:40:39Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:41:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\"
,\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.157773 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.171997 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.190101 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.202734 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.212085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.212147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.212163 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.212184 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.212197 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:35Z","lastTransitionTime":"2025-10-03T08:41:35Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.217801 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c772a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc 
kubenswrapper[4790]: I1003 08:41:35.315391 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.315471 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.315482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.315501 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.315521 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:35Z","lastTransitionTime":"2025-10-03T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.347329 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.418742 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.418821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.418842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.418875 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.418894 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:35Z","lastTransitionTime":"2025-10-03T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.522722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.522797 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.522817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.522842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.522862 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:35Z","lastTransitionTime":"2025-10-03T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.626474 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.626605 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.626641 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.626675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.626702 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:35Z","lastTransitionTime":"2025-10-03T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.729945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.730032 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.730056 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.730088 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.730111 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:35Z","lastTransitionTime":"2025-10-03T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.833546 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.833659 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.833682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.833714 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.833735 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:35Z","lastTransitionTime":"2025-10-03T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.936485 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.936550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.936612 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.936646 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.936671 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:35Z","lastTransitionTime":"2025-10-03T08:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.945667 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/3.log" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.950885 4790 scope.go:117] "RemoveContainer" containerID="873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8" Oct 03 08:41:35 crc kubenswrapper[4790]: E1003 08:41:35.951042 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.971342 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:35 crc kubenswrapper[4790]: I1003 08:41:35.992519 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649be
f25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:35Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.021888 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:34Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 08:41:34.148688 6814 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673c
ee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.039416 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.040245 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.040291 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.040302 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.040350 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.040364 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:36Z","lastTransitionTime":"2025-10-03T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.062350 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.084384 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.100044 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.120025 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.134845 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780b80f-57b7-4b09-8df8-ab5893b7dbd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac215e99f46978c6b11677b4a1965fb4c1e16820ff221134062941eb162085ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836c0a93c9a8ca16b512613d02dd11de6a852bc04e06dd03ae74c981f093f676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836c0a93c9a8ca16b512613d02dd11de6a852bc04e06dd03ae74c981f093f676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.143118 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.143177 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.143191 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.143229 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.143239 4790 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:36Z","lastTransitionTime":"2025-10-03T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.154711 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\",\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.172231 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.191547 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:24Z\\\",\\\"message\\\":\\\"2025-10-03T08:40:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354\\\\n2025-10-03T08:40:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354 to /host/opt/cni/bin/\\\\n2025-10-03T08:40:39Z [verbose] multus-daemon started\\\\n2025-10-03T08:40:39Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:41:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.209800 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.225963 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7108359a-23ed-4464-a43a-6a1ef9877500\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca9eba35e4f0edf15874f10d6b11af0b09a3c3a43eaf8340ec5604122463dc8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8b7cc917b16307413ed2fb2c91cf91b438cf0499138176e99562d0415bf872c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55ba2aa3b5d8466c1193dab58f6add86f97ed9065aaa71ba877ad002d7b30ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.244056 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.246074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.246145 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.246159 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.246183 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.246194 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:36Z","lastTransitionTime":"2025-10-03T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.256542 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.270563 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c7
72a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.289729 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:36Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.327937 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.328016 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.327996 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.328113 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:36 crc kubenswrapper[4790]: E1003 08:41:36.328359 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:36 crc kubenswrapper[4790]: E1003 08:41:36.328349 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:36 crc kubenswrapper[4790]: E1003 08:41:36.328524 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:36 crc kubenswrapper[4790]: E1003 08:41:36.328749 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.349895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.349946 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.349957 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.349969 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.349995 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:36Z","lastTransitionTime":"2025-10-03T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.453651 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.453693 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.453704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.453737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.453750 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:36Z","lastTransitionTime":"2025-10-03T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.556415 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.556464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.556474 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.556499 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.556513 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:36Z","lastTransitionTime":"2025-10-03T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.659261 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.659313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.659326 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.659347 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.659361 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:36Z","lastTransitionTime":"2025-10-03T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.761983 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.762041 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.762055 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.762076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.762089 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:36Z","lastTransitionTime":"2025-10-03T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.869060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.869211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.869249 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.869350 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.869438 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:36Z","lastTransitionTime":"2025-10-03T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.973029 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.973079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.973089 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.973108 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:36 crc kubenswrapper[4790]: I1003 08:41:36.973121 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:36Z","lastTransitionTime":"2025-10-03T08:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.076040 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.076109 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.076127 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.076157 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.076174 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:37Z","lastTransitionTime":"2025-10-03T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.178851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.178889 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.178900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.178917 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.178929 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:37Z","lastTransitionTime":"2025-10-03T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.281630 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.281722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.281741 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.281767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.281784 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:37Z","lastTransitionTime":"2025-10-03T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.385349 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.385430 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.385454 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.385487 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.385512 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:37Z","lastTransitionTime":"2025-10-03T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.488708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.488787 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.488802 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.488827 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.488850 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:37Z","lastTransitionTime":"2025-10-03T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.592273 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.592344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.592367 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.592397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.592418 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:37Z","lastTransitionTime":"2025-10-03T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.695445 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.695492 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.695508 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.695526 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.695538 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:37Z","lastTransitionTime":"2025-10-03T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.799339 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.799820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.799845 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.799873 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.799892 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:37Z","lastTransitionTime":"2025-10-03T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.903726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.904292 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.904321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.904347 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:37 crc kubenswrapper[4790]: I1003 08:41:37.904407 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:37Z","lastTransitionTime":"2025-10-03T08:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.007876 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.008297 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.008647 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.008883 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.009108 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:38Z","lastTransitionTime":"2025-10-03T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.075064 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.075282 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 08:42:42.075238832 +0000 UTC m=+148.428173130 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.113271 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.113779 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.114004 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.114180 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.114323 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:38Z","lastTransitionTime":"2025-10-03T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.217332 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.218017 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.218091 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.218275 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.218379 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:38Z","lastTransitionTime":"2025-10-03T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.277473 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.277536 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.277562 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.277619 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277755 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277791 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277835 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277849 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277842 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277786 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277909 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277919 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277875 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.277846115 +0000 UTC m=+148.630780543 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277970 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.277947158 +0000 UTC m=+148.630881396 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.277987 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.277978739 +0000 UTC m=+148.630912977 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.278002 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.277995409 +0000 UTC m=+148.630929647 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.322041 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.322106 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.322117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.322141 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.322153 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:38Z","lastTransitionTime":"2025-10-03T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.327885 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.327903 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.327944 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.327887 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.328027 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.328258 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.328270 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:38 crc kubenswrapper[4790]: E1003 08:41:38.328328 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.424892 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.424988 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.425001 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.425019 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.425031 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:38Z","lastTransitionTime":"2025-10-03T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.529442 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.529534 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.529558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.529626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.529650 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:38Z","lastTransitionTime":"2025-10-03T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.633266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.633330 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.633341 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.633362 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.633377 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:38Z","lastTransitionTime":"2025-10-03T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.736842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.736903 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.736920 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.736939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.736952 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:38Z","lastTransitionTime":"2025-10-03T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.840934 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.841029 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.841054 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.841087 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.841113 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:38Z","lastTransitionTime":"2025-10-03T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.944153 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.944202 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.944214 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.944237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:38 crc kubenswrapper[4790]: I1003 08:41:38.944251 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:38Z","lastTransitionTime":"2025-10-03T08:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.047063 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.047128 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.047142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.047166 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.047181 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:39Z","lastTransitionTime":"2025-10-03T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.150106 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.150184 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.150204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.150231 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.150249 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:39Z","lastTransitionTime":"2025-10-03T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.253611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.253680 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.253699 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.253731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.253755 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:39Z","lastTransitionTime":"2025-10-03T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.361440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.361597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.362735 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.362868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.362899 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:39Z","lastTransitionTime":"2025-10-03T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.467306 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.467356 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.467369 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.467389 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.467402 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:39Z","lastTransitionTime":"2025-10-03T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.571455 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.571530 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.571547 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.571612 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.571651 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:39Z","lastTransitionTime":"2025-10-03T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.674557 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.674617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.674629 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.674647 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.674658 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:39Z","lastTransitionTime":"2025-10-03T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.778031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.778117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.778141 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.778173 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.778196 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:39Z","lastTransitionTime":"2025-10-03T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.881682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.882080 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.882196 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.882304 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.882383 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:39Z","lastTransitionTime":"2025-10-03T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.985498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.985897 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.986071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.986212 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:39 crc kubenswrapper[4790]: I1003 08:41:39.986340 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:39Z","lastTransitionTime":"2025-10-03T08:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.089946 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.090009 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.090024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.090046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.090061 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:40Z","lastTransitionTime":"2025-10-03T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.195003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.195053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.195065 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.195087 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.195101 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:40Z","lastTransitionTime":"2025-10-03T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.299180 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.299261 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.299289 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.299321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.299340 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:40Z","lastTransitionTime":"2025-10-03T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.327954 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:40 crc kubenswrapper[4790]: E1003 08:41:40.328514 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.328083 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.328095 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.328009 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:40 crc kubenswrapper[4790]: E1003 08:41:40.329663 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:40 crc kubenswrapper[4790]: E1003 08:41:40.329822 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:40 crc kubenswrapper[4790]: E1003 08:41:40.329952 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.402387 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.402435 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.402446 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.402464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.402489 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:40Z","lastTransitionTime":"2025-10-03T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.505487 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.505523 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.505533 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.505550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.505561 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:40Z","lastTransitionTime":"2025-10-03T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.608657 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.608748 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.608777 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.608810 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.608832 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:40Z","lastTransitionTime":"2025-10-03T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.712303 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.712929 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.712944 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.712964 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.712976 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:40Z","lastTransitionTime":"2025-10-03T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.816053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.816129 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.816158 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.816192 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.816218 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:40Z","lastTransitionTime":"2025-10-03T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.919710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.919768 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.919780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.919798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:40 crc kubenswrapper[4790]: I1003 08:41:40.919809 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:40Z","lastTransitionTime":"2025-10-03T08:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.022737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.022802 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.022815 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.022841 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.022855 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:41Z","lastTransitionTime":"2025-10-03T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.125562 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.125653 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.125667 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.125690 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.125705 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:41Z","lastTransitionTime":"2025-10-03T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.229253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.230037 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.230307 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.230385 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.230445 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:41Z","lastTransitionTime":"2025-10-03T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.333329 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.334508 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.334695 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.334789 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.334861 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:41Z","lastTransitionTime":"2025-10-03T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.438406 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.438461 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.438474 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.438494 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.438510 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:41Z","lastTransitionTime":"2025-10-03T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.541472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.541568 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.541644 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.541695 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.541717 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:41Z","lastTransitionTime":"2025-10-03T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.644811 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.644899 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.644927 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.644961 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.644982 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:41Z","lastTransitionTime":"2025-10-03T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.748330 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.748726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.748925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.749059 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.749151 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:41Z","lastTransitionTime":"2025-10-03T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.853007 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.853118 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.853171 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.853199 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.853215 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:41Z","lastTransitionTime":"2025-10-03T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.956569 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.957074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.957292 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.957494 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:41 crc kubenswrapper[4790]: I1003 08:41:41.957760 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:41Z","lastTransitionTime":"2025-10-03T08:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.061551 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.061994 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.062128 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.062243 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.062365 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.166120 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.166211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.166267 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.166291 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.166304 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.270395 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.270925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.271092 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.271252 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.271480 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.327426 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:42 crc kubenswrapper[4790]: E1003 08:41:42.327643 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.327774 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:42 crc kubenswrapper[4790]: E1003 08:41:42.327966 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.328040 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.328406 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:42 crc kubenswrapper[4790]: E1003 08:41:42.328692 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:42 crc kubenswrapper[4790]: E1003 08:41:42.328931 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.352078 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.374681 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.374980 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.375122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.375260 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.375393 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.478711 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.479112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.479300 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.479499 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.479735 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.548505 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.548993 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.549107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.549225 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.549314 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: E1003 08:41:42.564808 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.570497 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.570560 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.570607 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.570639 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.570656 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: E1003 08:41:42.588228 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.592749 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.592807 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.592821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.592849 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.592862 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: E1003 08:41:42.605236 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.608645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.608756 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.608823 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.609150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.609224 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: E1003 08:41:42.619820 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.623294 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.623335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.623344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.623366 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.623387 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: E1003 08:41:42.634128 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:42Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:42 crc kubenswrapper[4790]: E1003 08:41:42.634252 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.636209 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.636240 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.636250 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.636266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.636277 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.739433 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.739480 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.739497 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.739521 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.739537 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.842363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.842823 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.842960 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.843095 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.843188 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.946116 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.946194 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.946217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.946250 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:42 crc kubenswrapper[4790]: I1003 08:41:42.946269 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:42Z","lastTransitionTime":"2025-10-03T08:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.049667 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.049743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.049763 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.049793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.049813 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:43Z","lastTransitionTime":"2025-10-03T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.154016 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.154099 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.154123 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.154170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.154196 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:43Z","lastTransitionTime":"2025-10-03T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.259194 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.259274 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.259298 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.259330 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.259352 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:43Z","lastTransitionTime":"2025-10-03T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.365265 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.365323 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.365336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.365359 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.365389 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:43Z","lastTransitionTime":"2025-10-03T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.469822 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.470384 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.470603 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.470773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.470939 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:43Z","lastTransitionTime":"2025-10-03T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.573980 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.574036 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.574053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.574076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.574093 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:43Z","lastTransitionTime":"2025-10-03T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.677084 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.677151 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.677162 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.677185 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.677197 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:43Z","lastTransitionTime":"2025-10-03T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.779564 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.779714 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.779727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.779745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.779765 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:43Z","lastTransitionTime":"2025-10-03T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.882957 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.883020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.883031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.883051 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.883065 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:43Z","lastTransitionTime":"2025-10-03T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.985822 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.985903 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.985930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.985967 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:43 crc kubenswrapper[4790]: I1003 08:41:43.985991 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:43Z","lastTransitionTime":"2025-10-03T08:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.089462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.089542 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.089564 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.089630 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.089650 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:44Z","lastTransitionTime":"2025-10-03T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.193277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.193359 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.193385 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.193419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.193444 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:44Z","lastTransitionTime":"2025-10-03T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.296922 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.297006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.297030 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.297060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.297084 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:44Z","lastTransitionTime":"2025-10-03T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.327984 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.328109 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.328184 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.327984 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:44 crc kubenswrapper[4790]: E1003 08:41:44.328280 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:44 crc kubenswrapper[4790]: E1003 08:41:44.328442 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:44 crc kubenswrapper[4790]: E1003 08:41:44.328683 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:44 crc kubenswrapper[4790]: E1003 08:41:44.328835 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.362646 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"019ea9ec-f9ed-4136-bda1-e0766548367a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd9a12a9a1b0afd247d92ff2723994cdeba69d3cbcfe00eb589ffb1be4b7b374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://637948fd9e320da9f1d3edbb91399d9bec76dd14c1e12f23975484c96f258a4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cab8364048d83836f12d33731b82fcadd0527f407da4b5343733f5e245d29a64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70b83da4dd337f5d3e58363e69374a5e11b89f25c0b50578a753542a14b6b0d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f031864d14c6032eab35a3563f92c99253446c550dd27903656bde7e5892fc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://400a812d6dcd1be3f7dd3023c3e52a4896f677984accef7ecfd27a7770afe056\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://400a812d6dcd1be3f7dd3023c3e52a4896f677984accef7ecfd27a7770afe056\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://047ddbddda224cd9235029ce051d9e3594a329bf0afee33fe82f357986193386\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://047ddbddda224cd9235029ce051d9e3594a329bf0afee33fe82f357986193386\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9386a8f03b1066fc35cd6a10d0eb073b525b6c8802d3211ac60488f6a231d837\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9386a8f03b1066fc35cd6a10d0eb073b525b6c8802d3211ac60488f6a231d837\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-10-03T08:40:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.381426 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.405156 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e01f9373-74e4-439b-bd7f-ad00e60a3284\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45be0671b4ad4d5e294f87dba5d52747298b00a3ebd9a13f2845881c0f1846a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://876c35e32e5877c2a69cdbb54b84ec0bc0daebd05c7847875a1c122aafac2057\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://938ff3fffad5347212d00d02f96c105cd5e6252a4d6a7ada258f0367da3f8e77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca99fc95f4bc463b8e638cfa1e6dc6be77fc4111b6ec61b4b2913fdf91f46671\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9667f
80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9667f80988a61a4f26f6f2b8b047ef94bdc6fa052083d5d1285f5bd9235615ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a4e7fb7b6f922e6e0fa2a7fcb5476ba549a6aa21dc88ee145bc55f4dac0912bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76a8239bfbb38c5858f1f9b0b01b9552463d8b2774564d7cdacb6757bf053656\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krfjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9fnsr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.406233 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.406493 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.406722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.406926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.407084 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:44Z","lastTransitionTime":"2025-10-03T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.422312 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-7bpwn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3923f73-5efb-4ce3-b68f-0c4e1e8eca4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://305c950f0352fe1bec1c151dc89eed1389144ad5ee321458d48c6609e688bb73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gt4gf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-7bpwn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.437734 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12525ca4-5242-41bc-b7ba-2c1ab5a45892\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ee4f33c7
72a396fdd8d2a6f958d402139cb81cfd79da0f32c83bbe9adbe0801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d79cdfa897fe5ca615930826e3a153a184ab7977626a9837dbcdbddbe4d27af6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2t6qw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fn764\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.459487 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://28b99cc027b4390bdc7e3ed519bb798eb4a7b825ff29e4574f3420b9922265b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.474882 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.494701 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.510672 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.510720 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.510729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.510750 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.510760 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:44Z","lastTransitionTime":"2025-10-03T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.512562 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36eae06a-d8be-4128-8d68-acb32e1f011b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97a1a7f3aba494cd0da6bcd15052a85d70642642a62d108ac0be51ce8e3d302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww2bb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-zqj4j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.542532 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa372dcc-63aa-48e5-b153-7739078026ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:34Z\\\",\\\"message\\\":\\\"itches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-ingress-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-ingress-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.244\\\\\\\", Port:9393, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1003 08:41:34.148688 6814 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:41:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e45c39722f37fc673c
ee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52pgl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsjvh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.557528 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vl95" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9236b957-81cb-40bc-969e-a2057884ba50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j2gd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vl95\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.569340 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e780b80f-57b7-4b09-8df8-ab5893b7dbd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac215e99f46978c6b11677b4a1965fb4c1e16820ff221134062941eb162085ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836c0a93c9a8ca16b512613d02dd11de6a852bc04e06dd03ae74c981f093f676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836c0a93c9a8ca16b512613d02dd11de6a852bc04e06dd03ae74c981f093f676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.587361 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"535765e1-fd5d-4501-b13e-684c5f08b296\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2e0caa9033c82a625d9594342d4e58052cfbfea8251bc0fa2f1b102ba235b5ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5e8a267e2821ea1c1628ff477a4efc3c5bcc78538460b0f30a160bd738dcee\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d85a637b07da3de31d73486c444bf86e7e2fd799180fc1cd9ab56fee61afaa9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c713cd887b75dc54dfefd8b613f5bdb78ade644b31fc7db5e789414fcacad4b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://905a99c09cfaadc3eddc17fa6e509098364525d7a03add46a97c98dc08def8a2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-03T08:40:30Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW1003 08:40:30.396497 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1003 08:40:30.399279 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1003 08:40:30.400119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-247026750/tls.crt::/tmp/serving-cert-247026750/tls.key\\\\\\\"\\\\nI1003 08:40:30.809123 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1003 08:40:30.812812 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1003 08:40:30.812831 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1003 08:40:30.812852 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1003 08:40:30.812858 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1003 08:40:30.817440 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1003 08:40:30.817463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1003 08:40:30.817482 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817491 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1003 08:40:30.817495 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1003 08:40:30.817498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1003 08:40:30.817501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1003 08:40:30.817504 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1003 08:40:30.821080 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4591d1f2d844cb16eef19a65aaad416898d23ca5ce266e4863a9810d49ff4ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f1281e6b9c9f364e4603bf622d0e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://444e973af493d5aca2c924bcaee20abca5f
1281e6b9c9f364e4603bf622d0e48\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.599614 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gprtl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c5bfb61-e69a-4576-b200-18dd7f38c7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dcb3e7b6550f0ad490e187c2154ec53cb427cbab7acaf0618eb4de188eb724c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66xc5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:34Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gprtl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.615217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.615283 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.615304 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.615328 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.615347 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:44Z","lastTransitionTime":"2025-10-03T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.615936 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3362615c1396a6051d3770c79e4b67eea9d0c9df2a1a45381f729c4120970818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41113d498e57788a4c3162e8e981f5b9d9a5a7d49ee50b15085688fbc6b67f4e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.633182 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddb59e5-6feb-471d-9594-f86e6993a2e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0138e08174582c0c1388716f1f6b9e42a16b009f1a467803c4dacd50c3d80ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c575d680c9bc994b8c2e97e720d794138153bc3329c4b7f8ef81115a0245d0cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://027dd0df8d83044cff9f4e8c87bef3bb18e2169bb93e615486fd1e9b5a17b86e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837e81f3b907d8506f674d13a9bc8f20f63e6c40e1f952cd57b715855ec9bf9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.651935 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7108359a-23ed-4464-a43a-6a1ef9877500\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca9eba35e4f0edf15874f10d6b11af0b09a3c3a43eaf8340ec5604122463dc8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8b7cc917b16307413ed2fb2c91cf91b438cf0499138176e99562d0415bf872c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55ba2aa3b5d8466c1193dab58f6add86f97ed9065aaa71ba877ad002d7b30ccc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://3486e595b261af48bff933da2e9a7c49631e4523babd35cc054b39291e410512\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-03T08:40:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.667362 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08d57a60251c60c6c6a2cab2b862c322c01b531ca1aa12ff4b0933bee9eb95f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.689232 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hncl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2f5631d-8486-44f9-81b0-ee78515e7866\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:40:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-03T08:41:24Z\\\",\\\"message\\\":\\\"2025-10-03T08:40:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354\\\\n2025-10-03T08:40:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_d2710489-37bc-43d0-9069-63e58afa0354 to /host/opt/cni/bin/\\\\n2025-10-03T08:40:39Z [verbose] multus-daemon started\\\\n2025-10-03T08:40:39Z [verbose] Readiness Indicator file check\\\\n2025-10-03T08:41:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-03T08:40:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-03T08:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strp2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-03T08:40:35Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hncl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:44Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.719287 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.719370 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.719396 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.719427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.719450 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:44Z","lastTransitionTime":"2025-10-03T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.823105 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.823178 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.823194 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.823222 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.823241 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:44Z","lastTransitionTime":"2025-10-03T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.927390 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.927473 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.927612 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.927679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:44 crc kubenswrapper[4790]: I1003 08:41:44.927710 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:44Z","lastTransitionTime":"2025-10-03T08:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.031796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.031895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.031925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.031957 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.031981 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:45Z","lastTransitionTime":"2025-10-03T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.134843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.134909 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.134929 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.134951 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.134968 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:45Z","lastTransitionTime":"2025-10-03T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.238511 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.238564 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.238658 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.238702 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.238727 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:45Z","lastTransitionTime":"2025-10-03T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.341742 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.341794 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.341804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.341822 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.341834 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:45Z","lastTransitionTime":"2025-10-03T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.444767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.444816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.444825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.444842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.444852 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:45Z","lastTransitionTime":"2025-10-03T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.548505 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.548571 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.548653 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.548680 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.548698 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:45Z","lastTransitionTime":"2025-10-03T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.651462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.651539 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.651558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.651626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.651647 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:45Z","lastTransitionTime":"2025-10-03T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.755316 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.755373 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.755391 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.755416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.755435 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:45Z","lastTransitionTime":"2025-10-03T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.859439 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.859500 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.859517 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.859542 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.859598 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:45Z","lastTransitionTime":"2025-10-03T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.962891 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.962970 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.962992 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.963026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:45 crc kubenswrapper[4790]: I1003 08:41:45.963048 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:45Z","lastTransitionTime":"2025-10-03T08:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.065739 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.065783 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.065795 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.065818 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.065830 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:46Z","lastTransitionTime":"2025-10-03T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.169621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.169824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.169914 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.170023 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.170134 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:46Z","lastTransitionTime":"2025-10-03T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.274146 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.274199 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.274213 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.274236 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.274250 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:46Z","lastTransitionTime":"2025-10-03T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.328289 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.328400 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.328349 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:46 crc kubenswrapper[4790]: E1003 08:41:46.328546 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.328292 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:46 crc kubenswrapper[4790]: E1003 08:41:46.328729 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:46 crc kubenswrapper[4790]: E1003 08:41:46.328777 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:46 crc kubenswrapper[4790]: E1003 08:41:46.328821 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.376386 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.376437 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.376447 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.376466 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.376478 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:46Z","lastTransitionTime":"2025-10-03T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.480015 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.480074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.480090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.480117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.480134 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:46Z","lastTransitionTime":"2025-10-03T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.583456 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.583513 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.583532 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.583560 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.583618 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:46Z","lastTransitionTime":"2025-10-03T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.686891 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.686940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.686949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.686963 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.686973 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:46Z","lastTransitionTime":"2025-10-03T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.792641 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.792729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.792741 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.792762 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.792774 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:46Z","lastTransitionTime":"2025-10-03T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.896372 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.896881 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.897097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.897239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:46 crc kubenswrapper[4790]: I1003 08:41:46.897400 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:46Z","lastTransitionTime":"2025-10-03T08:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.000397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.000730 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.000825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.000965 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.001075 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:47Z","lastTransitionTime":"2025-10-03T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.104708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.104766 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.104779 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.104798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.104811 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:47Z","lastTransitionTime":"2025-10-03T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.208176 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.208507 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.208637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.208710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.208794 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:47Z","lastTransitionTime":"2025-10-03T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.312064 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.312198 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.312217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.312247 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.312267 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:47Z","lastTransitionTime":"2025-10-03T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.415379 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.415454 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.415466 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.415483 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.415493 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:47Z","lastTransitionTime":"2025-10-03T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.517848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.517921 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.517930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.517951 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.517963 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:47Z","lastTransitionTime":"2025-10-03T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.621152 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.621197 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.621211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.621231 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.621245 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:47Z","lastTransitionTime":"2025-10-03T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.724041 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.724112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.724131 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.724160 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.724178 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:47Z","lastTransitionTime":"2025-10-03T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.828142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.828225 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.828244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.828273 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.828298 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:47Z","lastTransitionTime":"2025-10-03T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.932724 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.932782 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.932794 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.932815 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:47 crc kubenswrapper[4790]: I1003 08:41:47.932828 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:47Z","lastTransitionTime":"2025-10-03T08:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.037212 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.037293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.037318 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.037344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.037359 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:48Z","lastTransitionTime":"2025-10-03T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.141657 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.141728 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.141745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.141776 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.141798 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:48Z","lastTransitionTime":"2025-10-03T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.244817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.244863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.244877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.244895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.244908 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:48Z","lastTransitionTime":"2025-10-03T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.328068 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:48 crc kubenswrapper[4790]: E1003 08:41:48.328213 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.328401 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:48 crc kubenswrapper[4790]: E1003 08:41:48.328448 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.328570 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:48 crc kubenswrapper[4790]: E1003 08:41:48.328661 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.328761 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:48 crc kubenswrapper[4790]: E1003 08:41:48.328962 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.347756 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.347796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.347807 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.347822 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.347832 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:48Z","lastTransitionTime":"2025-10-03T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.450805 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.450847 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.450856 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.450871 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.450880 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:48Z","lastTransitionTime":"2025-10-03T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.554420 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.554484 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.554498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.554522 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.554538 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:48Z","lastTransitionTime":"2025-10-03T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.657505 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.657637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.657677 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.657712 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.657740 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:48Z","lastTransitionTime":"2025-10-03T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.760900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.760954 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.760966 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.760986 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.761001 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:48Z","lastTransitionTime":"2025-10-03T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.864189 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.864262 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.864288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.864321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.864356 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:48Z","lastTransitionTime":"2025-10-03T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.967210 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.967280 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.967303 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.967336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:48 crc kubenswrapper[4790]: I1003 08:41:48.967361 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:48Z","lastTransitionTime":"2025-10-03T08:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.070490 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.070532 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.070542 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.070558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.070568 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:49Z","lastTransitionTime":"2025-10-03T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.173869 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.173921 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.173931 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.173949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.173959 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:49Z","lastTransitionTime":"2025-10-03T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.277516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.277627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.277652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.277684 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.277707 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:49Z","lastTransitionTime":"2025-10-03T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.381348 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.381395 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.381406 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.381425 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.381437 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:49Z","lastTransitionTime":"2025-10-03T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.483929 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.483982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.483994 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.484014 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.484030 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:49Z","lastTransitionTime":"2025-10-03T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.586465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.586534 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.586552 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.586601 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.586619 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:49Z","lastTransitionTime":"2025-10-03T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.689322 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.689370 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.689381 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.689397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.689406 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:49Z","lastTransitionTime":"2025-10-03T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.792581 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.792617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.792626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.792641 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.792651 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:49Z","lastTransitionTime":"2025-10-03T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.895616 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.895727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.895745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.895770 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.895787 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:49Z","lastTransitionTime":"2025-10-03T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.998535 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.998606 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.998619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.998638 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:49 crc kubenswrapper[4790]: I1003 08:41:49.998654 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:49Z","lastTransitionTime":"2025-10-03T08:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.101785 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.101842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.101860 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.101884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.101904 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:50Z","lastTransitionTime":"2025-10-03T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.205027 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.205146 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.205173 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.205203 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.205225 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:50Z","lastTransitionTime":"2025-10-03T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.307924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.308001 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.308019 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.308047 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.308066 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:50Z","lastTransitionTime":"2025-10-03T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.327654 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.327743 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.327813 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:50 crc kubenswrapper[4790]: E1003 08:41:50.327885 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.328156 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:50 crc kubenswrapper[4790]: E1003 08:41:50.328697 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:50 crc kubenswrapper[4790]: E1003 08:41:50.329167 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:50 crc kubenswrapper[4790]: E1003 08:41:50.329485 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.411104 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.411156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.411168 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.411183 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.411195 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:50Z","lastTransitionTime":"2025-10-03T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.514164 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.514215 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.514228 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.514248 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.514264 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:50Z","lastTransitionTime":"2025-10-03T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.616886 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.616925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.616933 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.616948 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.616958 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:50Z","lastTransitionTime":"2025-10-03T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.719829 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.719868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.719880 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.719895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.719907 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:50Z","lastTransitionTime":"2025-10-03T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.822992 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.823049 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.823060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.823080 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.823095 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:50Z","lastTransitionTime":"2025-10-03T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.925921 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.926020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.926039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.926065 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:50 crc kubenswrapper[4790]: I1003 08:41:50.926084 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:50Z","lastTransitionTime":"2025-10-03T08:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.029536 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.029623 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.029637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.029658 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.029670 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:51Z","lastTransitionTime":"2025-10-03T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.132616 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.132664 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.132675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.132692 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.132703 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:51Z","lastTransitionTime":"2025-10-03T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.236191 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.236282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.236307 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.236349 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.236374 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:51Z","lastTransitionTime":"2025-10-03T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.328348 4790 scope.go:117] "RemoveContainer" containerID="873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8" Oct 03 08:41:51 crc kubenswrapper[4790]: E1003 08:41:51.328601 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.339616 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.339665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.339681 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.339697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.339708 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:51Z","lastTransitionTime":"2025-10-03T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.442885 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.442936 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.442947 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.442969 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.442982 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:51Z","lastTransitionTime":"2025-10-03T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.546559 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.546624 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.546634 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.546650 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.546662 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:51Z","lastTransitionTime":"2025-10-03T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.649655 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.649718 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.649731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.649746 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.649759 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:51Z","lastTransitionTime":"2025-10-03T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.752775 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.752835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.752848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.752869 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.752883 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:51Z","lastTransitionTime":"2025-10-03T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.855912 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.855989 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.856001 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.856023 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.856042 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:51Z","lastTransitionTime":"2025-10-03T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.959146 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.959208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.959219 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.959237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:51 crc kubenswrapper[4790]: I1003 08:41:51.959249 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:51Z","lastTransitionTime":"2025-10-03T08:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.062206 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.062296 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.062311 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.062333 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.062346 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.165288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.165363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.165385 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.165415 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.165438 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.268188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.268245 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.268262 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.268287 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.268321 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.328184 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:52 crc kubenswrapper[4790]: E1003 08:41:52.328367 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.328185 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.328224 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:52 crc kubenswrapper[4790]: E1003 08:41:52.328652 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.328213 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:52 crc kubenswrapper[4790]: E1003 08:41:52.328765 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:52 crc kubenswrapper[4790]: E1003 08:41:52.328851 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.372165 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.372220 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.372232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.372253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.372267 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.475246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.475304 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.475317 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.475336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.475350 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.579079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.579139 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.579155 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.579179 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.579193 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.681863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.681915 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.681924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.681943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.681954 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.785287 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.785348 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.785363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.785384 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.785400 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.887687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.887802 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.887821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.887849 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.887868 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.949421 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.949480 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.949493 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.949514 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.949528 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: E1003 08:41:52.964299 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:52Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.970479 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.970762 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.970916 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.971071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.971208 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:52 crc kubenswrapper[4790]: E1003 08:41:52.985503 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:52Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.989634 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.989683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.989696 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.989716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:52 crc kubenswrapper[4790]: I1003 08:41:52.989729 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:52Z","lastTransitionTime":"2025-10-03T08:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.009223 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.009272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.009288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.009306 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.009317 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.026850 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.026905 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.026918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.026940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.026953 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: E1003 08:41:53.040815 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-03T08:41:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-03T08:41:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e374b313-7030-4739-b7f1-da1c9b39fe8d\\\",\\\"systemUUID\\\":\\\"312be368-bbdc-461d-b74b-f792414190de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-03T08:41:53Z is after 2025-08-24T17:21:41Z" Oct 03 08:41:53 crc kubenswrapper[4790]: E1003 08:41:53.040966 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.042684 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.042737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.042746 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.042766 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.042802 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.047384 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:53 crc kubenswrapper[4790]: E1003 08:41:53.047590 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:41:53 crc kubenswrapper[4790]: E1003 08:41:53.047665 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs podName:9236b957-81cb-40bc-969e-a2057884ba50 nodeName:}" failed. No retries permitted until 2025-10-03 08:42:57.047648829 +0000 UTC m=+163.400583067 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs") pod "network-metrics-daemon-4vl95" (UID: "9236b957-81cb-40bc-969e-a2057884ba50") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.147397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.147472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.147495 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.147524 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.147548 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.250195 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.250825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.250945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.251066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.251167 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.354338 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.354403 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.354416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.354437 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.354451 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.458319 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.458363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.458376 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.458396 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.458410 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.568375 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.568443 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.568463 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.568490 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.568510 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.671435 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.671478 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.671489 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.671504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.671516 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.773931 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.773980 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.773991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.774008 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.774022 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.876623 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.876674 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.876685 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.876703 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.876717 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.979305 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.979380 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.979392 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.979418 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:53 crc kubenswrapper[4790]: I1003 08:41:53.979432 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:53Z","lastTransitionTime":"2025-10-03T08:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.082930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.083022 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.083030 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.083048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.083061 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:54Z","lastTransitionTime":"2025-10-03T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.194660 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.194722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.194735 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.194758 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.194774 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:54Z","lastTransitionTime":"2025-10-03T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.298909 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.299006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.299018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.299041 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.299055 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:54Z","lastTransitionTime":"2025-10-03T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.328474 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.328634 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.328517 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.328517 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:54 crc kubenswrapper[4790]: E1003 08:41:54.328806 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:54 crc kubenswrapper[4790]: E1003 08:41:54.328960 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:54 crc kubenswrapper[4790]: E1003 08:41:54.329050 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:54 crc kubenswrapper[4790]: E1003 08:41:54.329187 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.402378 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.402411 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.402420 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.402435 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.402446 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:54Z","lastTransitionTime":"2025-10-03T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.437869 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podStartSLOduration=79.437846575 podStartE2EDuration="1m19.437846575s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.412324704 +0000 UTC m=+100.765258942" watchObservedRunningTime="2025-10-03 08:41:54.437846575 +0000 UTC m=+100.790780813" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.487049 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=80.487025522 podStartE2EDuration="1m20.487025522s" podCreationTimestamp="2025-10-03 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.485173791 +0000 UTC m=+100.838108049" watchObservedRunningTime="2025-10-03 08:41:54.487025522 +0000 UTC m=+100.839959770" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.487620 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=19.487613558 podStartE2EDuration="19.487613558s" podCreationTimestamp="2025-10-03 08:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.463763155 +0000 UTC m=+100.816697403" watchObservedRunningTime="2025-10-03 08:41:54.487613558 +0000 UTC m=+100.840547806" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.501361 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gprtl" 
podStartSLOduration=81.501342894 podStartE2EDuration="1m21.501342894s" podCreationTimestamp="2025-10-03 08:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.500954074 +0000 UTC m=+100.853888342" watchObservedRunningTime="2025-10-03 08:41:54.501342894 +0000 UTC m=+100.854277142" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.506219 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.506292 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.506312 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.506338 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.506356 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:54Z","lastTransitionTime":"2025-10-03T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.535814 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=74.535786899 podStartE2EDuration="1m14.535786899s" podCreationTimestamp="2025-10-03 08:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.5351085 +0000 UTC m=+100.888042758" watchObservedRunningTime="2025-10-03 08:41:54.535786899 +0000 UTC m=+100.888721137" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.552179 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=47.552154857 podStartE2EDuration="47.552154857s" podCreationTimestamp="2025-10-03 08:41:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.551313784 +0000 UTC m=+100.904248042" watchObservedRunningTime="2025-10-03 08:41:54.552154857 +0000 UTC m=+100.905089095" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.609052 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.609120 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.609134 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.609155 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.609175 4790 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:54Z","lastTransitionTime":"2025-10-03T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.610729 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hncl2" podStartSLOduration=80.610714812 podStartE2EDuration="1m20.610714812s" podCreationTimestamp="2025-10-03 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.610141957 +0000 UTC m=+100.963076215" watchObservedRunningTime="2025-10-03 08:41:54.610714812 +0000 UTC m=+100.963649060" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.667195 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=12.667169079 podStartE2EDuration="12.667169079s" podCreationTimestamp="2025-10-03 08:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.651450759 +0000 UTC m=+101.004384997" watchObservedRunningTime="2025-10-03 08:41:54.667169079 +0000 UTC m=+101.020103317" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.701609 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9fnsr" podStartSLOduration=79.701562323 podStartE2EDuration="1m19.701562323s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.688436573 
+0000 UTC m=+101.041370811" watchObservedRunningTime="2025-10-03 08:41:54.701562323 +0000 UTC m=+101.054496571" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.702931 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7bpwn" podStartSLOduration=80.70292343 podStartE2EDuration="1m20.70292343s" podCreationTimestamp="2025-10-03 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.702094737 +0000 UTC m=+101.055028975" watchObservedRunningTime="2025-10-03 08:41:54.70292343 +0000 UTC m=+101.055857668" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.712246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.712332 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.712368 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.712463 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.712475 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:54Z","lastTransitionTime":"2025-10-03T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.815790 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.815836 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.815850 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.815868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.815880 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:54Z","lastTransitionTime":"2025-10-03T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.919417 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.919476 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.919490 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.919515 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:54 crc kubenswrapper[4790]: I1003 08:41:54.919529 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:54Z","lastTransitionTime":"2025-10-03T08:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.022606 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.022664 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.022679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.022700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.022713 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:55Z","lastTransitionTime":"2025-10-03T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.126309 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.126400 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.126409 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.126429 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.126441 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:55Z","lastTransitionTime":"2025-10-03T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.229466 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.229537 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.229553 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.229614 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.229631 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:55Z","lastTransitionTime":"2025-10-03T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.333081 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.333143 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.333155 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.333172 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.333182 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:55Z","lastTransitionTime":"2025-10-03T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.436630 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.436682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.436693 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.436717 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.436733 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:55Z","lastTransitionTime":"2025-10-03T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.542045 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.542109 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.542123 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.542142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.542151 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:55Z","lastTransitionTime":"2025-10-03T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.645037 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.645114 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.645125 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.645145 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.645155 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:55Z","lastTransitionTime":"2025-10-03T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.748422 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.748470 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.748480 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.748495 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.748509 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:55Z","lastTransitionTime":"2025-10-03T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.851708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.851760 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.851770 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.851792 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.851803 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:55Z","lastTransitionTime":"2025-10-03T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.955058 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.955103 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.955115 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.955135 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:55 crc kubenswrapper[4790]: I1003 08:41:55.955149 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:55Z","lastTransitionTime":"2025-10-03T08:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.057737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.057803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.057814 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.057835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.057849 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:56Z","lastTransitionTime":"2025-10-03T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.160861 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.160910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.160922 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.160941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.160952 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:56Z","lastTransitionTime":"2025-10-03T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.264214 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.264290 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.264727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.264768 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.264782 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:56Z","lastTransitionTime":"2025-10-03T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.328323 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.328360 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:56 crc kubenswrapper[4790]: E1003 08:41:56.328489 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.328521 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:56 crc kubenswrapper[4790]: E1003 08:41:56.328803 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.328852 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:56 crc kubenswrapper[4790]: E1003 08:41:56.328937 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:56 crc kubenswrapper[4790]: E1003 08:41:56.329051 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.368230 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.368293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.368302 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.368336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.368355 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:56Z","lastTransitionTime":"2025-10-03T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.471348 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.471393 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.471409 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.471437 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.471451 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:56Z","lastTransitionTime":"2025-10-03T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.574550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.574623 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.574635 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.574657 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.574672 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:56Z","lastTransitionTime":"2025-10-03T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.677660 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.677725 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.677741 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.677762 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.677775 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:56Z","lastTransitionTime":"2025-10-03T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.780683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.780764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.780783 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.780817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.780844 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:56Z","lastTransitionTime":"2025-10-03T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.883820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.883875 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.883884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.883902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.883915 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:56Z","lastTransitionTime":"2025-10-03T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.987160 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.987217 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.987231 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.987252 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:56 crc kubenswrapper[4790]: I1003 08:41:56.987266 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:56Z","lastTransitionTime":"2025-10-03T08:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.089968 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.090011 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.090022 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.090041 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.090053 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:57Z","lastTransitionTime":"2025-10-03T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.192081 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.192116 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.192125 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.192139 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.192149 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:57Z","lastTransitionTime":"2025-10-03T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.295325 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.295365 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.295375 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.295390 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.295401 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:57Z","lastTransitionTime":"2025-10-03T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.397413 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.397463 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.397473 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.397489 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.397498 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:57Z","lastTransitionTime":"2025-10-03T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.500106 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.500180 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.500195 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.500219 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.500236 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:57Z","lastTransitionTime":"2025-10-03T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.604382 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.604438 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.604453 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.604475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.604488 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:57Z","lastTransitionTime":"2025-10-03T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.707780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.707814 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.707823 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.707843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.707854 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:57Z","lastTransitionTime":"2025-10-03T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.810826 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.810887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.810900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.810924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.810941 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:57Z","lastTransitionTime":"2025-10-03T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.915163 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.915233 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.915250 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.915275 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:57 crc kubenswrapper[4790]: I1003 08:41:57.915293 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:57Z","lastTransitionTime":"2025-10-03T08:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.017798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.017898 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.017927 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.017954 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.017972 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:58Z","lastTransitionTime":"2025-10-03T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.121524 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.121626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.121641 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.121666 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.121683 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:58Z","lastTransitionTime":"2025-10-03T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.224819 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.224895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.224906 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.224930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.224941 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:58Z","lastTransitionTime":"2025-10-03T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.327295 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.327365 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.327393 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.327295 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:41:58 crc kubenswrapper[4790]: E1003 08:41:58.327486 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:41:58 crc kubenswrapper[4790]: E1003 08:41:58.327670 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:41:58 crc kubenswrapper[4790]: E1003 08:41:58.327814 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:41:58 crc kubenswrapper[4790]: E1003 08:41:58.327868 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.328109 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.328148 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.328160 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.328176 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.328187 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:58Z","lastTransitionTime":"2025-10-03T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.431035 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.431088 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.431097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.431119 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.431132 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:58Z","lastTransitionTime":"2025-10-03T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.534710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.534816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.534845 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.534894 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.534919 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:58Z","lastTransitionTime":"2025-10-03T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.638158 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.638230 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.638254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.638289 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.638312 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:58Z","lastTransitionTime":"2025-10-03T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.740904 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.740987 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.741000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.741022 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.741037 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:58Z","lastTransitionTime":"2025-10-03T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.844287 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.844350 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.844360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.844378 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.844388 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:58Z","lastTransitionTime":"2025-10-03T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.948377 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.948442 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.948461 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.948490 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:58 crc kubenswrapper[4790]: I1003 08:41:58.948508 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:58Z","lastTransitionTime":"2025-10-03T08:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.051832 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.051909 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.051932 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.051969 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.051996 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:59Z","lastTransitionTime":"2025-10-03T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.155488 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.155544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.155556 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.155597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.155614 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:59Z","lastTransitionTime":"2025-10-03T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.258745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.258803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.258814 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.258836 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.258849 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:59Z","lastTransitionTime":"2025-10-03T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.361821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.361871 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.361882 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.361903 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.361915 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:59Z","lastTransitionTime":"2025-10-03T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.464968 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.465018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.465037 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.465063 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.465081 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:59Z","lastTransitionTime":"2025-10-03T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.567837 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.567915 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.567925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.567946 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.567959 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:59Z","lastTransitionTime":"2025-10-03T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.671682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.671770 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.671781 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.671798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.671809 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:59Z","lastTransitionTime":"2025-10-03T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.775138 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.775204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.775222 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.775253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.775274 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:59Z","lastTransitionTime":"2025-10-03T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.877851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.877915 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.877927 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.877955 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.877970 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:59Z","lastTransitionTime":"2025-10-03T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.980960 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.981006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.981017 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.981035 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:41:59 crc kubenswrapper[4790]: I1003 08:41:59.981048 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:41:59Z","lastTransitionTime":"2025-10-03T08:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.084426 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.084500 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.084518 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.084545 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.084564 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:00Z","lastTransitionTime":"2025-10-03T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.187723 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.187796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.187812 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.187839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.187860 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:00Z","lastTransitionTime":"2025-10-03T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.290955 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.291067 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.291092 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.291128 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.291152 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:00Z","lastTransitionTime":"2025-10-03T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.327661 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.327756 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.327761 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.327821 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:00 crc kubenswrapper[4790]: E1003 08:42:00.327871 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:00 crc kubenswrapper[4790]: E1003 08:42:00.328083 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:00 crc kubenswrapper[4790]: E1003 08:42:00.328288 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:00 crc kubenswrapper[4790]: E1003 08:42:00.328439 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.394434 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.394492 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.394504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.394532 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.394549 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:00Z","lastTransitionTime":"2025-10-03T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.497679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.497751 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.497783 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.497817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.497841 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:00Z","lastTransitionTime":"2025-10-03T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.600860 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.600933 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.600945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.600967 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.600980 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:00Z","lastTransitionTime":"2025-10-03T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.704291 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.704345 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.704354 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.704372 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.704384 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:00Z","lastTransitionTime":"2025-10-03T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.807419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.807497 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.807524 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.807559 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.807612 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:00Z","lastTransitionTime":"2025-10-03T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.910127 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.910192 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.910204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.910221 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:00 crc kubenswrapper[4790]: I1003 08:42:00.910231 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:00Z","lastTransitionTime":"2025-10-03T08:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.012981 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.013046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.013058 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.013075 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.013085 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:01Z","lastTransitionTime":"2025-10-03T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.116306 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.116370 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.116379 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.116404 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.116414 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:01Z","lastTransitionTime":"2025-10-03T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.219511 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.219567 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.219603 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.219625 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.219637 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:01Z","lastTransitionTime":"2025-10-03T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.322130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.322185 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.322199 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.322214 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.322223 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:01Z","lastTransitionTime":"2025-10-03T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.425563 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.425637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.425650 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.425671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.425687 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:01Z","lastTransitionTime":"2025-10-03T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.528994 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.529058 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.529069 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.529087 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.529099 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:01Z","lastTransitionTime":"2025-10-03T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.631917 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.631984 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.632001 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.632026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.632045 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:01Z","lastTransitionTime":"2025-10-03T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.734461 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.734546 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.734558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.734591 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.734602 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:01Z","lastTransitionTime":"2025-10-03T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.838039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.838116 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.838132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.838154 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.838166 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:01Z","lastTransitionTime":"2025-10-03T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.940785 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.940869 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.940897 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.940931 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:01 crc kubenswrapper[4790]: I1003 08:42:01.940959 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:01Z","lastTransitionTime":"2025-10-03T08:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.042754 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.042793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.042803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.042841 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.042853 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:02Z","lastTransitionTime":"2025-10-03T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.146892 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.146940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.146952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.146970 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.146982 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:02Z","lastTransitionTime":"2025-10-03T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.250175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.250238 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.250246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.250264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.250275 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:02Z","lastTransitionTime":"2025-10-03T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.327321 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.327439 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.327517 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.327757 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:02 crc kubenswrapper[4790]: E1003 08:42:02.327746 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:02 crc kubenswrapper[4790]: E1003 08:42:02.327867 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:02 crc kubenswrapper[4790]: E1003 08:42:02.327958 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:02 crc kubenswrapper[4790]: E1003 08:42:02.328269 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.353339 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.353397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.353410 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.353429 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.353445 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:02Z","lastTransitionTime":"2025-10-03T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.457110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.457188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.457200 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.457222 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.457713 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:02Z","lastTransitionTime":"2025-10-03T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.559943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.560007 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.560025 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.560048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.560060 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:02Z","lastTransitionTime":"2025-10-03T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.662449 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.662505 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.662517 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.662534 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.662545 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:02Z","lastTransitionTime":"2025-10-03T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.765177 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.765228 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.765240 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.765256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.765270 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:02Z","lastTransitionTime":"2025-10-03T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.868535 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.868591 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.868605 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.868622 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.868635 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:02Z","lastTransitionTime":"2025-10-03T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.972065 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.972143 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.972154 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.972173 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:02 crc kubenswrapper[4790]: I1003 08:42:02.972185 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:02Z","lastTransitionTime":"2025-10-03T08:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.075600 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.075652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.075664 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.075682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.075699 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:03Z","lastTransitionTime":"2025-10-03T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.178285 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.178355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.178374 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.178396 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.178409 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:03Z","lastTransitionTime":"2025-10-03T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.281472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.281533 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.281543 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.281562 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.281591 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:03Z","lastTransitionTime":"2025-10-03T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.288818 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.288854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.288867 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.288884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.288896 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-03T08:42:03Z","lastTransitionTime":"2025-10-03T08:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.336064 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fn764" podStartSLOduration=88.336011718 podStartE2EDuration="1m28.336011718s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:41:54.718044165 +0000 UTC m=+101.070978403" watchObservedRunningTime="2025-10-03 08:42:03.336011718 +0000 UTC m=+109.688945976" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.337657 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5"] Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.338675 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.342798 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.343130 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.343429 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.344896 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.467340 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/bf5619c1-d3f7-4133-99cc-e6af15152842-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.467401 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf5619c1-d3f7-4133-99cc-e6af15152842-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.467441 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bf5619c1-d3f7-4133-99cc-e6af15152842-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.467494 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5619c1-d3f7-4133-99cc-e6af15152842-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.467541 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5619c1-d3f7-4133-99cc-e6af15152842-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.569194 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/bf5619c1-d3f7-4133-99cc-e6af15152842-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.569270 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf5619c1-d3f7-4133-99cc-e6af15152842-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.569300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bf5619c1-d3f7-4133-99cc-e6af15152842-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.569338 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5619c1-d3f7-4133-99cc-e6af15152842-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.569404 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/bf5619c1-d3f7-4133-99cc-e6af15152842-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.569469 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5619c1-d3f7-4133-99cc-e6af15152842-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.569674 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/bf5619c1-d3f7-4133-99cc-e6af15152842-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.571828 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5619c1-d3f7-4133-99cc-e6af15152842-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.579142 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5619c1-d3f7-4133-99cc-e6af15152842-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 
08:42:03.588747 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf5619c1-d3f7-4133-99cc-e6af15152842-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dl8w5\" (UID: \"bf5619c1-d3f7-4133-99cc-e6af15152842\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:03 crc kubenswrapper[4790]: I1003 08:42:03.664041 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" Oct 03 08:42:04 crc kubenswrapper[4790]: I1003 08:42:04.047867 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" event={"ID":"bf5619c1-d3f7-4133-99cc-e6af15152842","Type":"ContainerStarted","Data":"529fd34d35e3731edfbd30adf9189ccabd27da6fcb5e2220808090e7117a9973"} Oct 03 08:42:04 crc kubenswrapper[4790]: I1003 08:42:04.047943 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" event={"ID":"bf5619c1-d3f7-4133-99cc-e6af15152842","Type":"ContainerStarted","Data":"227aefe060c6c28991982ced10e5c1645930ea4c48fe3fd6f67bbf7698481564"} Oct 03 08:42:04 crc kubenswrapper[4790]: I1003 08:42:04.327593 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:04 crc kubenswrapper[4790]: I1003 08:42:04.327667 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:04 crc kubenswrapper[4790]: I1003 08:42:04.327612 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:04 crc kubenswrapper[4790]: I1003 08:42:04.327788 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:04 crc kubenswrapper[4790]: E1003 08:42:04.330372 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:04 crc kubenswrapper[4790]: E1003 08:42:04.330537 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:04 crc kubenswrapper[4790]: E1003 08:42:04.330726 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:04 crc kubenswrapper[4790]: E1003 08:42:04.330727 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:05 crc kubenswrapper[4790]: I1003 08:42:05.328546 4790 scope.go:117] "RemoveContainer" containerID="873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8" Oct 03 08:42:05 crc kubenswrapper[4790]: E1003 08:42:05.328859 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsjvh_openshift-ovn-kubernetes(fa372dcc-63aa-48e5-b153-7739078026ef)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" Oct 03 08:42:06 crc kubenswrapper[4790]: I1003 08:42:06.327677 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:06 crc kubenswrapper[4790]: I1003 08:42:06.327725 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:06 crc kubenswrapper[4790]: I1003 08:42:06.327677 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:06 crc kubenswrapper[4790]: E1003 08:42:06.327833 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:06 crc kubenswrapper[4790]: I1003 08:42:06.327915 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:06 crc kubenswrapper[4790]: E1003 08:42:06.327948 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:06 crc kubenswrapper[4790]: E1003 08:42:06.328169 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:06 crc kubenswrapper[4790]: E1003 08:42:06.328348 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:08 crc kubenswrapper[4790]: I1003 08:42:08.327714 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:08 crc kubenswrapper[4790]: I1003 08:42:08.327783 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:08 crc kubenswrapper[4790]: I1003 08:42:08.327873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:08 crc kubenswrapper[4790]: I1003 08:42:08.328804 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:08 crc kubenswrapper[4790]: E1003 08:42:08.328974 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:08 crc kubenswrapper[4790]: E1003 08:42:08.329108 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:08 crc kubenswrapper[4790]: E1003 08:42:08.329257 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:08 crc kubenswrapper[4790]: E1003 08:42:08.329358 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:10 crc kubenswrapper[4790]: I1003 08:42:10.328160 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:10 crc kubenswrapper[4790]: I1003 08:42:10.328401 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:10 crc kubenswrapper[4790]: I1003 08:42:10.328409 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:10 crc kubenswrapper[4790]: I1003 08:42:10.328478 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:10 crc kubenswrapper[4790]: E1003 08:42:10.328686 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:10 crc kubenswrapper[4790]: E1003 08:42:10.328860 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:10 crc kubenswrapper[4790]: E1003 08:42:10.329046 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:10 crc kubenswrapper[4790]: E1003 08:42:10.329166 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:11 crc kubenswrapper[4790]: I1003 08:42:11.073857 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hncl2_f2f5631d-8486-44f9-81b0-ee78515e7866/kube-multus/1.log" Oct 03 08:42:11 crc kubenswrapper[4790]: I1003 08:42:11.074979 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hncl2_f2f5631d-8486-44f9-81b0-ee78515e7866/kube-multus/0.log" Oct 03 08:42:11 crc kubenswrapper[4790]: I1003 08:42:11.075043 4790 generic.go:334] "Generic (PLEG): container finished" podID="f2f5631d-8486-44f9-81b0-ee78515e7866" containerID="934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c" exitCode=1 Oct 03 08:42:11 crc kubenswrapper[4790]: I1003 08:42:11.075084 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hncl2" event={"ID":"f2f5631d-8486-44f9-81b0-ee78515e7866","Type":"ContainerDied","Data":"934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c"} Oct 03 08:42:11 crc kubenswrapper[4790]: I1003 08:42:11.075136 4790 scope.go:117] "RemoveContainer" containerID="3fde680972c36057c9e06b0f745acbfb2e6ab399b14d6c25a3c379b65e598afd" Oct 03 08:42:11 crc kubenswrapper[4790]: I1003 08:42:11.075742 4790 scope.go:117] "RemoveContainer" containerID="934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c" Oct 03 08:42:11 crc kubenswrapper[4790]: E1003 08:42:11.075994 
4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-hncl2_openshift-multus(f2f5631d-8486-44f9-81b0-ee78515e7866)\"" pod="openshift-multus/multus-hncl2" podUID="f2f5631d-8486-44f9-81b0-ee78515e7866" Oct 03 08:42:11 crc kubenswrapper[4790]: I1003 08:42:11.104018 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dl8w5" podStartSLOduration=97.103988975 podStartE2EDuration="1m37.103988975s" podCreationTimestamp="2025-10-03 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:04.063978061 +0000 UTC m=+110.416912319" watchObservedRunningTime="2025-10-03 08:42:11.103988975 +0000 UTC m=+117.456923233" Oct 03 08:42:12 crc kubenswrapper[4790]: I1003 08:42:12.082561 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hncl2_f2f5631d-8486-44f9-81b0-ee78515e7866/kube-multus/1.log" Oct 03 08:42:12 crc kubenswrapper[4790]: I1003 08:42:12.327614 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:12 crc kubenswrapper[4790]: I1003 08:42:12.327723 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:12 crc kubenswrapper[4790]: I1003 08:42:12.327750 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:12 crc kubenswrapper[4790]: I1003 08:42:12.327869 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:12 crc kubenswrapper[4790]: E1003 08:42:12.327860 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:12 crc kubenswrapper[4790]: E1003 08:42:12.328100 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:12 crc kubenswrapper[4790]: E1003 08:42:12.328235 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:12 crc kubenswrapper[4790]: E1003 08:42:12.328387 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:14 crc kubenswrapper[4790]: I1003 08:42:14.327828 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:14 crc kubenswrapper[4790]: I1003 08:42:14.327915 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:14 crc kubenswrapper[4790]: I1003 08:42:14.327959 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:14 crc kubenswrapper[4790]: I1003 08:42:14.329043 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:14 crc kubenswrapper[4790]: E1003 08:42:14.329027 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:14 crc kubenswrapper[4790]: E1003 08:42:14.329236 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:14 crc kubenswrapper[4790]: E1003 08:42:14.329282 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:14 crc kubenswrapper[4790]: E1003 08:42:14.329349 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:14 crc kubenswrapper[4790]: E1003 08:42:14.330704 4790 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 03 08:42:14 crc kubenswrapper[4790]: E1003 08:42:14.463362 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 08:42:16 crc kubenswrapper[4790]: I1003 08:42:16.328016 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:16 crc kubenswrapper[4790]: E1003 08:42:16.328198 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:16 crc kubenswrapper[4790]: I1003 08:42:16.328856 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:16 crc kubenswrapper[4790]: I1003 08:42:16.328958 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:16 crc kubenswrapper[4790]: I1003 08:42:16.329000 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:16 crc kubenswrapper[4790]: E1003 08:42:16.329165 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:16 crc kubenswrapper[4790]: E1003 08:42:16.329243 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:16 crc kubenswrapper[4790]: E1003 08:42:16.329419 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:18 crc kubenswrapper[4790]: I1003 08:42:18.327391 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:18 crc kubenswrapper[4790]: I1003 08:42:18.327740 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:18 crc kubenswrapper[4790]: I1003 08:42:18.327737 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:18 crc kubenswrapper[4790]: I1003 08:42:18.327731 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:18 crc kubenswrapper[4790]: E1003 08:42:18.327850 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:18 crc kubenswrapper[4790]: E1003 08:42:18.328287 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:18 crc kubenswrapper[4790]: E1003 08:42:18.328346 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:18 crc kubenswrapper[4790]: E1003 08:42:18.328406 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:18 crc kubenswrapper[4790]: I1003 08:42:18.328508 4790 scope.go:117] "RemoveContainer" containerID="873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8" Oct 03 08:42:19 crc kubenswrapper[4790]: I1003 08:42:19.109405 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/3.log" Oct 03 08:42:19 crc kubenswrapper[4790]: I1003 08:42:19.112701 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerStarted","Data":"85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2"} Oct 03 08:42:19 crc kubenswrapper[4790]: I1003 08:42:19.113215 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:42:19 crc kubenswrapper[4790]: I1003 08:42:19.143748 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podStartSLOduration=104.143724631 podStartE2EDuration="1m44.143724631s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:19.142818946 +0000 UTC m=+125.495753264" watchObservedRunningTime="2025-10-03 08:42:19.143724631 +0000 UTC m=+125.496658869" Oct 03 08:42:19 crc kubenswrapper[4790]: I1003 08:42:19.265521 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4vl95"] Oct 03 08:42:19 crc kubenswrapper[4790]: I1003 08:42:19.265933 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:19 crc kubenswrapper[4790]: E1003 08:42:19.266076 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:19 crc kubenswrapper[4790]: E1003 08:42:19.465308 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 03 08:42:20 crc kubenswrapper[4790]: I1003 08:42:20.328042 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:20 crc kubenswrapper[4790]: I1003 08:42:20.328132 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:20 crc kubenswrapper[4790]: E1003 08:42:20.328214 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:20 crc kubenswrapper[4790]: I1003 08:42:20.328236 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:20 crc kubenswrapper[4790]: E1003 08:42:20.328360 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:20 crc kubenswrapper[4790]: E1003 08:42:20.328569 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:21 crc kubenswrapper[4790]: I1003 08:42:21.327973 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:21 crc kubenswrapper[4790]: E1003 08:42:21.328121 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:22 crc kubenswrapper[4790]: I1003 08:42:22.328419 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:22 crc kubenswrapper[4790]: I1003 08:42:22.328545 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:22 crc kubenswrapper[4790]: I1003 08:42:22.328625 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:22 crc kubenswrapper[4790]: E1003 08:42:22.328931 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:22 crc kubenswrapper[4790]: I1003 08:42:22.329070 4790 scope.go:117] "RemoveContainer" containerID="934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c" Oct 03 08:42:22 crc kubenswrapper[4790]: E1003 08:42:22.329105 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:22 crc kubenswrapper[4790]: E1003 08:42:22.329395 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:23 crc kubenswrapper[4790]: I1003 08:42:23.130190 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hncl2_f2f5631d-8486-44f9-81b0-ee78515e7866/kube-multus/1.log" Oct 03 08:42:23 crc kubenswrapper[4790]: I1003 08:42:23.130835 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hncl2" event={"ID":"f2f5631d-8486-44f9-81b0-ee78515e7866","Type":"ContainerStarted","Data":"6c79a380a2fa475e309213e6650566780b9ed65a03fcabc66eca015ddcc0ecde"} Oct 03 08:42:23 crc kubenswrapper[4790]: I1003 08:42:23.327726 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:23 crc kubenswrapper[4790]: E1003 08:42:23.327945 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vl95" podUID="9236b957-81cb-40bc-969e-a2057884ba50" Oct 03 08:42:24 crc kubenswrapper[4790]: I1003 08:42:24.327877 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:24 crc kubenswrapper[4790]: E1003 08:42:24.329413 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 03 08:42:24 crc kubenswrapper[4790]: I1003 08:42:24.329722 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:24 crc kubenswrapper[4790]: I1003 08:42:24.329795 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:24 crc kubenswrapper[4790]: E1003 08:42:24.329928 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 03 08:42:24 crc kubenswrapper[4790]: E1003 08:42:24.329802 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 03 08:42:25 crc kubenswrapper[4790]: I1003 08:42:25.327620 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:25 crc kubenswrapper[4790]: I1003 08:42:25.330599 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 03 08:42:25 crc kubenswrapper[4790]: I1003 08:42:25.330660 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 03 08:42:26 crc kubenswrapper[4790]: I1003 08:42:26.328119 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:26 crc kubenswrapper[4790]: I1003 08:42:26.328125 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:26 crc kubenswrapper[4790]: I1003 08:42:26.328085 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:26 crc kubenswrapper[4790]: I1003 08:42:26.331058 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 03 08:42:26 crc kubenswrapper[4790]: I1003 08:42:26.331067 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 03 08:42:26 crc kubenswrapper[4790]: I1003 08:42:26.332851 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 03 08:42:26 crc kubenswrapper[4790]: I1003 08:42:26.333076 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.785138 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 03 08:42:33 crc 
kubenswrapper[4790]: I1003 08:42:33.818722 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wlp9g"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.819526 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.821626 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.822392 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.825739 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.826654 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.826704 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.826923 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.827375 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.828075 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.828098 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.828535 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.832339 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.832823 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.833168 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.833348 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.833537 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.833748 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.833542 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.835253 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 
08:42:33.835491 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.835669 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.835834 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.836218 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.836642 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.837079 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.837359 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.837086 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.837473 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tml4h"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.837631 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.838015 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.838115 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.838496 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.839114 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.842142 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.842814 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.843462 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.843704 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.843901 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.844285 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jk7ch"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.844959 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.844967 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-4gzvp"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.845693 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4gzvp" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.845920 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.846689 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.846795 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.847542 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.847673 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.847788 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.848219 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tdsgg"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.848388 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 03 
08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.848659 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.848816 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.848925 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.848978 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.849116 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.849256 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.848816 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ggr2t"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.849928 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.850066 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.858928 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.863514 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xnn4z"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.863922 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.906029 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.906878 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.909750 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.910103 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.910861 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.911197 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-msk65"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.911406 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.911811 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.912363 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.912504 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.914561 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.914903 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.915090 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.915482 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.916737 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.916934 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.917404 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kzt96"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.917533 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 
08:42:33.917866 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.918125 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.918347 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.918613 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.918901 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.919127 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.919217 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.919261 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.919434 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.919554 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.919723 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 08:42:33 crc 
kubenswrapper[4790]: I1003 08:42:33.919876 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.919951 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.920037 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.920352 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.920456 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.920652 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.921910 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.922152 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.922872 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.925760 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9zxx"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.926540 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.926729 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.933930 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tml4h"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.936973 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.938031 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wlp9g"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.941041 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.942190 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.945645 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs"] Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.946184 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.946514 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.948687 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.949031 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.949137 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.949435 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.949699 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.949899 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.950064 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.950214 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.950401 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.950677 
4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.950763 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.950901 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.951326 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.951371 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.955041 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.955287 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.955545 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.955690 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.955816 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964339 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/01e7c005-7917-4002-b9d7-df2f8f138162-machine-approver-tls\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cxxjr\" (UID: \"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964442 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-etcd-serving-ca\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964479 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-config\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964504 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4rdw\" (UniqueName: \"kubernetes.io/projected/5b33ea81-e9c2-4297-a0ed-6fe0a1688563-kube-api-access-b4rdw\") pod \"cluster-samples-operator-665b6dd947-f9drg\" (UID: \"5b33ea81-e9c2-4297-a0ed-6fe0a1688563\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964528 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c013e7-68e8-462d-9d0d-3079d2245c8f-serving-cert\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-serving-cert\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964601 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjnph\" (UniqueName: \"kubernetes.io/projected/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-kube-api-access-gjnph\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964628 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-oauth-config\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964650 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-service-ca\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964683 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-config\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964710 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x985s\" (UniqueName: \"kubernetes.io/projected/1f090fa3-d832-4693-a9a7-24cdae8aae07-kube-api-access-x985s\") pod \"openshift-config-operator-7777fb866f-vm8dg\" (UID: \"1f090fa3-d832-4693-a9a7-24cdae8aae07\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964731 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964755 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f090fa3-d832-4693-a9a7-24cdae8aae07-serving-cert\") pod \"openshift-config-operator-7777fb866f-vm8dg\" (UID: \"1f090fa3-d832-4693-a9a7-24cdae8aae07\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:33 
crc kubenswrapper[4790]: I1003 08:42:33.964776 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-node-pullsecrets\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964795 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-etcd-client\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964815 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-client-ca\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964837 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74be60a5-ccdf-40b5-bc76-ebf2196ae376-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964859 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-etcd-ca\") pod \"etcd-operator-b45778765-xnn4z\" (UID: 
\"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964885 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-audit-policies\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964908 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-encryption-config\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964936 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e7c005-7917-4002-b9d7-df2f8f138162-config\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964961 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.964987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965014 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-audit\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965036 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drfp6\" (UniqueName: \"kubernetes.io/projected/3bf63bc2-e5fc-4974-b78e-185796fea03e-kube-api-access-drfp6\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965065 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80c013e7-68e8-462d-9d0d-3079d2245c8f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965092 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-client-ca\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965251 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965513 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965517 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965555 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-image-import-ca\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965563 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965637 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-oauth-serving-cert\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 
08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965714 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fx7z\" (UniqueName: \"kubernetes.io/projected/01e7c005-7917-4002-b9d7-df2f8f138162-kube-api-access-7fx7z\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965755 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlxb9\" (UniqueName: \"kubernetes.io/projected/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-kube-api-access-xlxb9\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-audit-dir\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-serving-cert\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965828 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/5b33ea81-e9c2-4297-a0ed-6fe0a1688563-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-f9drg\" (UID: \"5b33ea81-e9c2-4297-a0ed-6fe0a1688563\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965852 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nfwn\" (UniqueName: \"kubernetes.io/projected/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-kube-api-access-6nfwn\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x642r\" (UniqueName: \"kubernetes.io/projected/2f81f816-0c58-437b-be43-b782515e8997-kube-api-access-x642r\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965901 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf63bc2-e5fc-4974-b78e-185796fea03e-serving-cert\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.965918 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-trusted-ca-bundle\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " 
pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-encryption-config\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966043 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966077 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-etcd-client\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966101 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f81f816-0c58-437b-be43-b782515e8997-config\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966122 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-serving-cert\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966149 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80c013e7-68e8-462d-9d0d-3079d2245c8f-config\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966171 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-audit-dir\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966195 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966221 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kclsv\" (UniqueName: \"kubernetes.io/projected/6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd-kube-api-access-kclsv\") pod \"openshift-apiserver-operator-796bbdcf4f-cxxjr\" (UID: \"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" 
Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-config\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966303 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljf4f\" (UniqueName: \"kubernetes.io/projected/74be60a5-ccdf-40b5-bc76-ebf2196ae376-kube-api-access-ljf4f\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966347 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966368 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-etcd-service-ca\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966417 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/2f81f816-0c58-437b-be43-b782515e8997-images\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966442 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f81f816-0c58-437b-be43-b782515e8997-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-config\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966489 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjchd\" (UniqueName: \"kubernetes.io/projected/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-kube-api-access-wjchd\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966546 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n54s\" (UniqueName: \"kubernetes.io/projected/80c013e7-68e8-462d-9d0d-3079d2245c8f-kube-api-access-4n54s\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" 
Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966569 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/01e7c005-7917-4002-b9d7-df2f8f138162-auth-proxy-config\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966614 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-config\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966637 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-serving-cert\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966658 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfphj\" (UniqueName: \"kubernetes.io/projected/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-kube-api-access-cfphj\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966708 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/80c013e7-68e8-462d-9d0d-3079d2245c8f-service-ca-bundle\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966732 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-etcd-client\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966750 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-trusted-ca\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966773 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-config\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966819 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-serving-cert\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 
08:42:33.966843 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1f090fa3-d832-4693-a9a7-24cdae8aae07-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vm8dg\" (UID: \"1f090fa3-d832-4693-a9a7-24cdae8aae07\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966878 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlbkz\" (UniqueName: \"kubernetes.io/projected/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-kube-api-access-tlbkz\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966903 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbxtm\" (UniqueName: \"kubernetes.io/projected/cf6939a0-5f2b-4db5-b5c2-613e19bb4b43-kube-api-access-bbxtm\") pod \"downloads-7954f5f757-4gzvp\" (UID: \"cf6939a0-5f2b-4db5-b5c2-613e19bb4b43\") " pod="openshift-console/downloads-7954f5f757-4gzvp" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.966939 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cxxjr\" (UID: \"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.972926 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 
08:42:33.973098 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.973202 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.973307 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.973404 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.973495 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.973607 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.981653 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.982126 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.982381 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 03 08:42:33 crc kubenswrapper[4790]: I1003 08:42:33.998279 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.004035 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 03 08:42:34 crc 
kubenswrapper[4790]: I1003 08:42:34.004387 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.006510 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.009481 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.039388 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.039708 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.040128 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2j9mx"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.041120 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.050338 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.054752 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.056031 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.056098 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.060122 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.060773 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.061192 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-csbrj"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.063028 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.063078 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.063253 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.063743 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.063773 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.064821 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.065628 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.065828 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mksjr"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.067844 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlbkz\" (UniqueName: \"kubernetes.io/projected/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-kube-api-access-tlbkz\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.067900 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbxtm\" (UniqueName: \"kubernetes.io/projected/cf6939a0-5f2b-4db5-b5c2-613e19bb4b43-kube-api-access-bbxtm\") pod \"downloads-7954f5f757-4gzvp\" (UID: \"cf6939a0-5f2b-4db5-b5c2-613e19bb4b43\") " pod="openshift-console/downloads-7954f5f757-4gzvp" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.067927 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cxxjr\" (UID: \"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.067957 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.067990 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.068021 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/01e7c005-7917-4002-b9d7-df2f8f138162-machine-approver-tls\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.068046 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81ab3c47-6cf1-493f-b1bd-730c970de51f-webhook-cert\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.068072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8gwr\" (UniqueName: \"kubernetes.io/projected/81ab3c47-6cf1-493f-b1bd-730c970de51f-kube-api-access-b8gwr\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.068095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cxxjr\" (UID: \"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.068123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.068146 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsj27\" (UniqueName: \"kubernetes.io/projected/0826b49c-d336-4af5-82fc-c0e9d03cc7f4-kube-api-access-nsj27\") pod \"dns-operator-744455d44c-kzt96\" (UID: \"0826b49c-d336-4af5-82fc-c0e9d03cc7f4\") " pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.068190 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-etcd-serving-ca\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069168 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-config\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc 
kubenswrapper[4790]: I1003 08:42:34.069200 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4rdw\" (UniqueName: \"kubernetes.io/projected/5b33ea81-e9c2-4297-a0ed-6fe0a1688563-kube-api-access-b4rdw\") pod \"cluster-samples-operator-665b6dd947-f9drg\" (UID: \"5b33ea81-e9c2-4297-a0ed-6fe0a1688563\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069229 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c013e7-68e8-462d-9d0d-3079d2245c8f-serving-cert\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069258 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-serving-cert\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069288 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjnph\" (UniqueName: \"kubernetes.io/projected/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-kube-api-access-gjnph\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069307 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-etcd-serving-ca\") pod \"apiserver-76f77b778f-wlp9g\" (UID: 
\"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069315 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-oauth-config\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069381 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-service-ca\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-config\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069432 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x985s\" (UniqueName: \"kubernetes.io/projected/1f090fa3-d832-4693-a9a7-24cdae8aae07-kube-api-access-x985s\") pod \"openshift-config-operator-7777fb866f-vm8dg\" (UID: \"1f090fa3-d832-4693-a9a7-24cdae8aae07\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069452 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069475 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f090fa3-d832-4693-a9a7-24cdae8aae07-serving-cert\") pod \"openshift-config-operator-7777fb866f-vm8dg\" (UID: \"1f090fa3-d832-4693-a9a7-24cdae8aae07\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069527 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-node-pullsecrets\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069544 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-etcd-client\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069564 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-client-ca\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069604 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74be60a5-ccdf-40b5-bc76-ebf2196ae376-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069622 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-etcd-ca\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-audit-policies\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069660 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-encryption-config\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 
08:42:34.069681 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e7c005-7917-4002-b9d7-df2f8f138162-config\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069698 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069716 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069733 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069750 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-audit\") pod \"apiserver-76f77b778f-wlp9g\" (UID: 
\"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drfp6\" (UniqueName: \"kubernetes.io/projected/3bf63bc2-e5fc-4974-b78e-185796fea03e-kube-api-access-drfp6\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069787 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-policies\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhmwn\" (UniqueName: \"kubernetes.io/projected/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-kube-api-access-mhmwn\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069831 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80c013e7-68e8-462d-9d0d-3079d2245c8f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069855 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-client-ca\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069881 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-image-import-ca\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069920 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-oauth-serving-cert\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069943 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fx7z\" (UniqueName: \"kubernetes.io/projected/01e7c005-7917-4002-b9d7-df2f8f138162-kube-api-access-7fx7z\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069963 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlxb9\" (UniqueName: \"kubernetes.io/projected/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-kube-api-access-xlxb9\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069980 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-audit-dir\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.069995 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-serving-cert\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b33ea81-e9c2-4297-a0ed-6fe0a1688563-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-f9drg\" (UID: \"5b33ea81-e9c2-4297-a0ed-6fe0a1688563\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070030 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nfwn\" (UniqueName: \"kubernetes.io/projected/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-kube-api-access-6nfwn\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070052 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x642r\" (UniqueName: \"kubernetes.io/projected/2f81f816-0c58-437b-be43-b782515e8997-kube-api-access-x642r\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf63bc2-e5fc-4974-b78e-185796fea03e-serving-cert\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070089 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-trusted-ca-bundle\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070106 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070125 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/81ab3c47-6cf1-493f-b1bd-730c970de51f-apiservice-cert\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070143 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0826b49c-d336-4af5-82fc-c0e9d03cc7f4-metrics-tls\") pod \"dns-operator-744455d44c-kzt96\" (UID: \"0826b49c-d336-4af5-82fc-c0e9d03cc7f4\") " pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070163 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fpg5\" (UniqueName: \"kubernetes.io/projected/560651e5-35a6-4428-b763-972bdbef6c6d-kube-api-access-7fpg5\") pod \"openshift-controller-manager-operator-756b6f6bc6-lzfvb\" (UID: \"560651e5-35a6-4428-b763-972bdbef6c6d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070183 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-encryption-config\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " 
pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070219 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070238 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/560651e5-35a6-4428-b763-972bdbef6c6d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lzfvb\" (UID: \"560651e5-35a6-4428-b763-972bdbef6c6d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070254 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070273 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-etcd-client\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070290 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2f81f816-0c58-437b-be43-b782515e8997-config\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070305 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-serving-cert\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070323 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80c013e7-68e8-462d-9d0d-3079d2245c8f-config\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070340 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-audit-dir\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070372 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kclsv\" (UniqueName: \"kubernetes.io/projected/6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd-kube-api-access-kclsv\") pod \"openshift-apiserver-operator-796bbdcf4f-cxxjr\" (UID: \"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070387 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-config\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070407 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljf4f\" (UniqueName: \"kubernetes.io/projected/74be60a5-ccdf-40b5-bc76-ebf2196ae376-kube-api-access-ljf4f\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070425 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070444 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-etcd-service-ca\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070462 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-dir\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070489 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f81f816-0c58-437b-be43-b782515e8997-images\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070507 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f81f816-0c58-437b-be43-b782515e8997-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070524 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-config\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070543 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjchd\" (UniqueName: \"kubernetes.io/projected/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-kube-api-access-wjchd\") pod 
\"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070594 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/560651e5-35a6-4428-b763-972bdbef6c6d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lzfvb\" (UID: \"560651e5-35a6-4428-b763-972bdbef6c6d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070614 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n54s\" (UniqueName: \"kubernetes.io/projected/80c013e7-68e8-462d-9d0d-3079d2245c8f-kube-api-access-4n54s\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/01e7c005-7917-4002-b9d7-df2f8f138162-auth-proxy-config\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 
08:42:34.070709 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-config\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070729 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-serving-cert\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070751 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfphj\" (UniqueName: \"kubernetes.io/projected/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-kube-api-access-cfphj\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070784 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80c013e7-68e8-462d-9d0d-3079d2245c8f-service-ca-bundle\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070804 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-etcd-client\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070823 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-trusted-ca\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070850 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-config\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070869 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070895 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-serving-cert\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070912 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1f090fa3-d832-4693-a9a7-24cdae8aae07-available-featuregates\") 
pod \"openshift-config-operator-7777fb866f-vm8dg\" (UID: \"1f090fa3-d832-4693-a9a7-24cdae8aae07\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070928 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.070946 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/81ab3c47-6cf1-493f-b1bd-730c970de51f-tmpfs\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.068233 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.071719 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.072132 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.072965 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-config\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: 
\"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.073162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.073738 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-client-ca\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.073808 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-node-pullsecrets\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.075918 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-audit\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.077533 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cxxjr\" (UID: 
\"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.077670 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.077801 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e7c005-7917-4002-b9d7-df2f8f138162-config\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.078122 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-service-ca\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.078251 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.078448 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.078711 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-oauth-config\") pod \"console-f9d7485db-ggr2t\" (UID: 
\"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.078784 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.078805 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.078959 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.079251 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.068376 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.077735 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-client-ca\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.095096 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-etcd-client\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.096143 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-trusted-ca-bundle\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.096845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b33ea81-e9c2-4297-a0ed-6fe0a1688563-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-f9drg\" (UID: \"5b33ea81-e9c2-4297-a0ed-6fe0a1688563\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.097988 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-etcd-ca\") pod \"etcd-operator-b45778765-xnn4z\" (UID: 
\"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.098139 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.098338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80c013e7-68e8-462d-9d0d-3079d2245c8f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.098390 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74be60a5-ccdf-40b5-bc76-ebf2196ae376-serving-cert\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.098355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-audit-policies\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.099000 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-encryption-config\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 
08:42:34.099555 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-serving-cert\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.100323 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.099561 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f090fa3-d832-4693-a9a7-24cdae8aae07-serving-cert\") pod \"openshift-config-operator-7777fb866f-vm8dg\" (UID: \"1f090fa3-d832-4693-a9a7-24cdae8aae07\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.100542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-config\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.100962 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/01e7c005-7917-4002-b9d7-df2f8f138162-machine-approver-tls\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.101277 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf63bc2-e5fc-4974-b78e-185796fea03e-serving-cert\") 
pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.101784 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.101879 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.101972 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.102242 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.102594 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.102636 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-image-import-ca\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.102976 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-audit-dir\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" 
Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.103713 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-oauth-serving-cert\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.068694 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cxxjr\" (UID: \"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.103986 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.104045 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.104729 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-config\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.105413 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-audit-dir\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.107027 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1f090fa3-d832-4693-a9a7-24cdae8aae07-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vm8dg\" (UID: \"1f090fa3-d832-4693-a9a7-24cdae8aae07\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.107314 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-config\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.107973 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h7tdr"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.108562 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-etcd-client\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.108713 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80c013e7-68e8-462d-9d0d-3079d2245c8f-service-ca-bundle\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.108920 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-encryption-config\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.109245 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.109378 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.110859 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-config\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.111561 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.112075 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f81f816-0c58-437b-be43-b782515e8997-config\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.113863 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80c013e7-68e8-462d-9d0d-3079d2245c8f-config\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.114014 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.114352 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-serving-cert\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.114333 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80c013e7-68e8-462d-9d0d-3079d2245c8f-serving-cert\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.114930 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-config\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.115051 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-etcd-service-ca\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.115423 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-trusted-ca\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.115561 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f81f816-0c58-437b-be43-b782515e8997-images\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.116295 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.118353 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f81f816-0c58-437b-be43-b782515e8997-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.118671 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xnn4z"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.117212 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/01e7c005-7917-4002-b9d7-df2f8f138162-auth-proxy-config\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.118728 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xhdkx"] Oct 03 08:42:34 crc kubenswrapper[4790]: 
I1003 08:42:34.118703 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-etcd-client\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.118720 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-serving-cert\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.118905 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.119547 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kzt96"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.119712 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ggr2t"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.119739 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.120258 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-msk65"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.120287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-serving-cert\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.121316 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.121353 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-serving-cert\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.122423 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9zxx"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.123745 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4gzvp"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.124958 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.126141 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg"] Oct 03 08:42:34 crc 
kubenswrapper[4790]: I1003 08:42:34.127204 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5btj7"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.128111 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.129222 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jk7ch"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.131176 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tdsgg"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.131828 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.132996 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-np5k5"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.133603 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.136067 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.137938 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.138748 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.140249 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.141282 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2j9mx"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.141431 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.142405 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mksjr"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.143600 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-jwdjw"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.148767 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.148799 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-csbrj"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.148810 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.148912 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.149205 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.150828 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.152183 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.155031 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.158014 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.160328 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.162674 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.162953 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.164007 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5btj7"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.164315 4790 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.166982 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h7tdr"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.168350 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-np5k5"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.169930 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qc2rd"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.170754 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qc2rd" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.171682 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.171799 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/81ab3c47-6cf1-493f-b1bd-730c970de51f-webhook-cert\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.171856 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8gwr\" (UniqueName: \"kubernetes.io/projected/81ab3c47-6cf1-493f-b1bd-730c970de51f-kube-api-access-b8gwr\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.171895 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.171920 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsj27\" (UniqueName: \"kubernetes.io/projected/0826b49c-d336-4af5-82fc-c0e9d03cc7f4-kube-api-access-nsj27\") pod \"dns-operator-744455d44c-kzt96\" (UID: \"0826b49c-d336-4af5-82fc-c0e9d03cc7f4\") " pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.171992 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6fd95bed-c64a-41fa-9382-233d6b69b31b-profile-collector-cert\") pod \"catalog-operator-68c6474976-8zvcg\" (UID: \"6fd95bed-c64a-41fa-9382-233d6b69b31b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172025 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgcpz\" (UniqueName: \"kubernetes.io/projected/3db3e714-d4c8-4e90-9d7a-d694fadfa856-kube-api-access-kgcpz\") pod \"dns-default-2j9mx\" (UID: \"3db3e714-d4c8-4e90-9d7a-d694fadfa856\") " pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172059 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172102 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b926d95-536f-4925-8078-d14df3d56611-trusted-ca\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172130 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc49b92e-3574-4789-9f65-c2aeb678899b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgfpf\" (UID: \"fc49b92e-3574-4789-9f65-c2aeb678899b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172157 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172183 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172222 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-policies\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172246 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhmwn\" (UniqueName: \"kubernetes.io/projected/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-kube-api-access-mhmwn\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc49b92e-3574-4789-9f65-c2aeb678899b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgfpf\" (UID: \"fc49b92e-3574-4789-9f65-c2aeb678899b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172310 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6fd95bed-c64a-41fa-9382-233d6b69b31b-srv-cert\") pod \"catalog-operator-68c6474976-8zvcg\" (UID: \"6fd95bed-c64a-41fa-9382-233d6b69b31b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172370 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/81ab3c47-6cf1-493f-b1bd-730c970de51f-apiservice-cert\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172449 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqz8p\" (UniqueName: \"kubernetes.io/projected/89e2e475-d36b-4626-ab23-e767070eb376-kube-api-access-rqz8p\") pod \"control-plane-machine-set-operator-78cbb6b69f-dp7r9\" (UID: \"89e2e475-d36b-4626-ab23-e767070eb376\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0826b49c-d336-4af5-82fc-c0e9d03cc7f4-metrics-tls\") pod \"dns-operator-744455d44c-kzt96\" (UID: \"0826b49c-d336-4af5-82fc-c0e9d03cc7f4\") " pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172503 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6ngr\" (UniqueName: \"kubernetes.io/projected/a04ec950-87ce-4f04-9c9d-201a4f6099a1-kube-api-access-z6ngr\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172535 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fpg5\" (UniqueName: 
\"kubernetes.io/projected/560651e5-35a6-4428-b763-972bdbef6c6d-kube-api-access-7fpg5\") pod \"openshift-controller-manager-operator-756b6f6bc6-lzfvb\" (UID: \"560651e5-35a6-4428-b763-972bdbef6c6d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b926d95-536f-4925-8078-d14df3d56611-metrics-tls\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172609 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172641 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/560651e5-35a6-4428-b763-972bdbef6c6d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lzfvb\" (UID: \"560651e5-35a6-4428-b763-972bdbef6c6d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172679 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: 
\"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172716 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwc8p\" (UniqueName: \"kubernetes.io/projected/9b926d95-536f-4925-8078-d14df3d56611-kube-api-access-qwc8p\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172793 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-dir\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172829 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/89e2e475-d36b-4626-ab23-e767070eb376-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dp7r9\" (UID: \"89e2e475-d36b-4626-ab23-e767070eb376\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172871 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b926d95-536f-4925-8078-d14df3d56611-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172934 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/560651e5-35a6-4428-b763-972bdbef6c6d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lzfvb\" (UID: \"560651e5-35a6-4428-b763-972bdbef6c6d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.172962 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173004 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69spb\" (UniqueName: \"kubernetes.io/projected/6fd95bed-c64a-41fa-9382-233d6b69b31b-kube-api-access-69spb\") pod \"catalog-operator-68c6474976-8zvcg\" (UID: \"6fd95bed-c64a-41fa-9382-233d6b69b31b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173056 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173082 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc49b92e-3574-4789-9f65-c2aeb678899b-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-qgfpf\" (UID: \"fc49b92e-3574-4789-9f65-c2aeb678899b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173171 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173669 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3db3e714-d4c8-4e90-9d7a-d694fadfa856-metrics-tls\") pod \"dns-default-2j9mx\" (UID: \"3db3e714-d4c8-4e90-9d7a-d694fadfa856\") " pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173723 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173732 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173744 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/81ab3c47-6cf1-493f-b1bd-730c970de51f-tmpfs\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173786 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3db3e714-d4c8-4e90-9d7a-d694fadfa856-config-volume\") pod \"dns-default-2j9mx\" (UID: \"3db3e714-d4c8-4e90-9d7a-d694fadfa856\") " pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173671 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-dir\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.173988 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.174126 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 
08:42:34.174135 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-policies\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.174329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.174804 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.175034 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/560651e5-35a6-4428-b763-972bdbef6c6d-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-lzfvb\" (UID: \"560651e5-35a6-4428-b763-972bdbef6c6d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.175160 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/81ab3c47-6cf1-493f-b1bd-730c970de51f-tmpfs\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: 
\"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.176964 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.177455 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.177625 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.178752 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qc2rd"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.179096 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/560651e5-35a6-4428-b763-972bdbef6c6d-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-lzfvb\" (UID: \"560651e5-35a6-4428-b763-972bdbef6c6d\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.179455 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.179651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.179670 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.179816 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.180374 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-t9hsf"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.182017 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.182079 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.182195 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-t9hsf"] Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.183340 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0826b49c-d336-4af5-82fc-c0e9d03cc7f4-metrics-tls\") pod \"dns-operator-744455d44c-kzt96\" (UID: \"0826b49c-d336-4af5-82fc-c0e9d03cc7f4\") " pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.186365 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.201611 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.221924 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.225982 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/81ab3c47-6cf1-493f-b1bd-730c970de51f-webhook-cert\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.226880 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/81ab3c47-6cf1-493f-b1bd-730c970de51f-apiservice-cert\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.242202 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.261834 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.275919 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc49b92e-3574-4789-9f65-c2aeb678899b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgfpf\" (UID: \"fc49b92e-3574-4789-9f65-c2aeb678899b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.277324 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6fd95bed-c64a-41fa-9382-233d6b69b31b-srv-cert\") pod \"catalog-operator-68c6474976-8zvcg\" (UID: \"6fd95bed-c64a-41fa-9382-233d6b69b31b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.277456 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6ngr\" (UniqueName: \"kubernetes.io/projected/a04ec950-87ce-4f04-9c9d-201a4f6099a1-kube-api-access-z6ngr\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.277555 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqz8p\" (UniqueName: \"kubernetes.io/projected/89e2e475-d36b-4626-ab23-e767070eb376-kube-api-access-rqz8p\") pod \"control-plane-machine-set-operator-78cbb6b69f-dp7r9\" (UID: \"89e2e475-d36b-4626-ab23-e767070eb376\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.277677 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b926d95-536f-4925-8078-d14df3d56611-metrics-tls\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.277743 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwc8p\" (UniqueName: \"kubernetes.io/projected/9b926d95-536f-4925-8078-d14df3d56611-kube-api-access-qwc8p\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.278999 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/89e2e475-d36b-4626-ab23-e767070eb376-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-dp7r9\" (UID: \"89e2e475-d36b-4626-ab23-e767070eb376\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279062 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b926d95-536f-4925-8078-d14df3d56611-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69spb\" (UniqueName: \"kubernetes.io/projected/6fd95bed-c64a-41fa-9382-233d6b69b31b-kube-api-access-69spb\") pod \"catalog-operator-68c6474976-8zvcg\" (UID: \"6fd95bed-c64a-41fa-9382-233d6b69b31b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279239 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc49b92e-3574-4789-9f65-c2aeb678899b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgfpf\" (UID: \"fc49b92e-3574-4789-9f65-c2aeb678899b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279297 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3db3e714-d4c8-4e90-9d7a-d694fadfa856-metrics-tls\") pod \"dns-default-2j9mx\" (UID: \"3db3e714-d4c8-4e90-9d7a-d694fadfa856\") " pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279349 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3db3e714-d4c8-4e90-9d7a-d694fadfa856-config-volume\") pod \"dns-default-2j9mx\" (UID: \"3db3e714-d4c8-4e90-9d7a-d694fadfa856\") " pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279520 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6fd95bed-c64a-41fa-9382-233d6b69b31b-profile-collector-cert\") pod \"catalog-operator-68c6474976-8zvcg\" (UID: \"6fd95bed-c64a-41fa-9382-233d6b69b31b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279556 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgcpz\" (UniqueName: \"kubernetes.io/projected/3db3e714-d4c8-4e90-9d7a-d694fadfa856-kube-api-access-kgcpz\") pod \"dns-default-2j9mx\" (UID: \"3db3e714-d4c8-4e90-9d7a-d694fadfa856\") " pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279685 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b926d95-536f-4925-8078-d14df3d56611-trusted-ca\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279726 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc49b92e-3574-4789-9f65-c2aeb678899b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgfpf\" (UID: \"fc49b92e-3574-4789-9f65-c2aeb678899b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.279762 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.310706 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.312634 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b926d95-536f-4925-8078-d14df3d56611-trusted-ca\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.322842 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.341923 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.362815 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.381744 4790 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.391749 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3db3e714-d4c8-4e90-9d7a-d694fadfa856-config-volume\") pod \"dns-default-2j9mx\" (UID: \"3db3e714-d4c8-4e90-9d7a-d694fadfa856\") " pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.401950 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.416271 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3db3e714-d4c8-4e90-9d7a-d694fadfa856-metrics-tls\") pod \"dns-default-2j9mx\" (UID: \"3db3e714-d4c8-4e90-9d7a-d694fadfa856\") " pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.422905 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.442373 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.454152 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b926d95-536f-4925-8078-d14df3d56611-metrics-tls\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.462882 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.482027 4790 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.501847 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.521850 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.542873 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.562714 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.581822 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.602763 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.622736 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.642231 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.663004 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.675504 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/89e2e475-d36b-4626-ab23-e767070eb376-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dp7r9\" (UID: \"89e2e475-d36b-4626-ab23-e767070eb376\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.710282 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlbkz\" (UniqueName: \"kubernetes.io/projected/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-kube-api-access-tlbkz\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.718178 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbxtm\" (UniqueName: \"kubernetes.io/projected/cf6939a0-5f2b-4db5-b5c2-613e19bb4b43-kube-api-access-bbxtm\") pod \"downloads-7954f5f757-4gzvp\" (UID: \"cf6939a0-5f2b-4db5-b5c2-613e19bb4b43\") " pod="openshift-console/downloads-7954f5f757-4gzvp" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.743716 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4rdw\" (UniqueName: \"kubernetes.io/projected/5b33ea81-e9c2-4297-a0ed-6fe0a1688563-kube-api-access-b4rdw\") pod \"cluster-samples-operator-665b6dd947-f9drg\" (UID: \"5b33ea81-e9c2-4297-a0ed-6fe0a1688563\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.743751 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-4gzvp" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.758190 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x985s\" (UniqueName: \"kubernetes.io/projected/1f090fa3-d832-4693-a9a7-24cdae8aae07-kube-api-access-x985s\") pod \"openshift-config-operator-7777fb866f-vm8dg\" (UID: \"1f090fa3-d832-4693-a9a7-24cdae8aae07\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.776825 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drfp6\" (UniqueName: \"kubernetes.io/projected/3bf63bc2-e5fc-4974-b78e-185796fea03e-kube-api-access-drfp6\") pod \"controller-manager-879f6c89f-tdsgg\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.799085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nfwn\" (UniqueName: \"kubernetes.io/projected/3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f-kube-api-access-6nfwn\") pod \"etcd-operator-b45778765-xnn4z\" (UID: \"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.815852 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.818685 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x642r\" (UniqueName: \"kubernetes.io/projected/2f81f816-0c58-437b-be43-b782515e8997-kube-api-access-x642r\") pod \"machine-api-operator-5694c8668f-tml4h\" (UID: \"2f81f816-0c58-437b-be43-b782515e8997\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.842031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjnph\" (UniqueName: \"kubernetes.io/projected/e2c063f6-68dc-4e05-bb17-fcbee1a2eec6-kube-api-access-gjnph\") pod \"apiserver-76f77b778f-wlp9g\" (UID: \"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6\") " pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.842709 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.842881 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.883820 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.890591 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fx7z\" (UniqueName: \"kubernetes.io/projected/01e7c005-7917-4002-b9d7-df2f8f138162-kube-api-access-7fx7z\") pod \"machine-approver-56656f9798-8rqjf\" (UID: \"01e7c005-7917-4002-b9d7-df2f8f138162\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.905918 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.921750 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.923479 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfphj\" (UniqueName: \"kubernetes.io/projected/44aed4cf-ccda-49ee-9d8a-c1b95946bdf9-kube-api-access-cfphj\") pod \"apiserver-7bbb656c7d-jjxln\" (UID: \"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.943165 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.954720 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.955408 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4gzvp"] Oct 03 08:42:34 crc kubenswrapper[4790]: W1003 08:42:34.960891 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf6939a0_5f2b_4db5_b5c2_613e19bb4b43.slice/crio-7a5c9807dba9643976d94ed381a8408c43d794d6e6de553065032932422358ce WatchSource:0}: Error finding container 7a5c9807dba9643976d94ed381a8408c43d794d6e6de553065032932422358ce: Status 404 returned error can't find the container with id 7a5c9807dba9643976d94ed381a8408c43d794d6e6de553065032932422358ce Oct 03 08:42:34 crc kubenswrapper[4790]: I1003 08:42:34.962887 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.000157 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlxb9\" (UniqueName: \"kubernetes.io/projected/04e0d67a-57a0-47be-8b3a-79e45a4c17cd-kube-api-access-xlxb9\") pod \"console-operator-58897d9998-msk65\" (UID: \"04e0d67a-57a0-47be-8b3a-79e45a4c17cd\") " pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.002049 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.025521 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.038145 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.041268 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xnn4z"] Oct 03 08:42:35 crc kubenswrapper[4790]: W1003 08:42:35.052888 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af1e215_a4dc_4be0_b4a8_d6fdbd120f2f.slice/crio-f952124f46345b52b02a16cea253960dd2e7a884eef82ce58b715a4d9aeae470 WatchSource:0}: Error finding container f952124f46345b52b02a16cea253960dd2e7a884eef82ce58b715a4d9aeae470: Status 404 returned error can't find the container with id f952124f46345b52b02a16cea253960dd2e7a884eef82ce58b715a4d9aeae470 Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.066305 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n54s\" (UniqueName: \"kubernetes.io/projected/80c013e7-68e8-462d-9d0d-3079d2245c8f-kube-api-access-4n54s\") pod \"authentication-operator-69f744f599-jk7ch\" (UID: \"80c013e7-68e8-462d-9d0d-3079d2245c8f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.068266 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.082241 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.083432 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg"] Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.088316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjchd\" (UniqueName: \"kubernetes.io/projected/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-kube-api-access-wjchd\") pod \"console-f9d7485db-ggr2t\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.099493 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.102899 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.107614 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.125824 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.127548 4790 request.go:700] Waited for 1.023252168s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/secrets?fieldSelector=metadata.name%3Dkube-scheduler-operator-serving-cert&limit=500&resourceVersion=0 Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.129771 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.132762 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-tml4h"] Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.149980 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc49b92e-3574-4789-9f65-c2aeb678899b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgfpf\" (UID: \"fc49b92e-3574-4789-9f65-c2aeb678899b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.150513 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.152101 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc49b92e-3574-4789-9f65-c2aeb678899b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgfpf\" (UID: \"fc49b92e-3574-4789-9f65-c2aeb678899b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.163754 
4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.184370 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 03 08:42:35 crc kubenswrapper[4790]: W1003 08:42:35.184826 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f81f816_0c58_437b_be43_b782515e8997.slice/crio-c489ea5b50e4f651c855cc5d66d0f2c389a5a9f999da52dee624174a4e3177f4 WatchSource:0}: Error finding container c489ea5b50e4f651c855cc5d66d0f2c389a5a9f999da52dee624174a4e3177f4: Status 404 returned error can't find the container with id c489ea5b50e4f651c855cc5d66d0f2c389a5a9f999da52dee624174a4e3177f4 Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.186782 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" event={"ID":"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f","Type":"ContainerStarted","Data":"f952124f46345b52b02a16cea253960dd2e7a884eef82ce58b715a4d9aeae470"} Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.192605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4gzvp" event={"ID":"cf6939a0-5f2b-4db5-b5c2-613e19bb4b43","Type":"ContainerStarted","Data":"9f0a66de7417e8f483735c569283c06c12dc7f44b7733bd829bae388b25e1eb1"} Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.192669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4gzvp" event={"ID":"cf6939a0-5f2b-4db5-b5c2-613e19bb4b43","Type":"ContainerStarted","Data":"7a5c9807dba9643976d94ed381a8408c43d794d6e6de553065032932422358ce"} Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.193129 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-4gzvp" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.197030 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-4gzvp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.197090 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4gzvp" podUID="cf6939a0-5f2b-4db5-b5c2-613e19bb4b43" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 03 08:42:35 crc kubenswrapper[4790]: W1003 08:42:35.197191 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01e7c005_7917_4002_b9d7_df2f8f138162.slice/crio-702b7f67b51b84c4437c8dd3e66a2fa5e0a4e1ccd083b8fb35599b27668cf4c5 WatchSource:0}: Error finding container 702b7f67b51b84c4437c8dd3e66a2fa5e0a4e1ccd083b8fb35599b27668cf4c5: Status 404 returned error can't find the container with id 702b7f67b51b84c4437c8dd3e66a2fa5e0a4e1ccd083b8fb35599b27668cf4c5 Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.197341 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6fd95bed-c64a-41fa-9382-233d6b69b31b-profile-collector-cert\") pod \"catalog-operator-68c6474976-8zvcg\" (UID: \"6fd95bed-c64a-41fa-9382-233d6b69b31b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.201003 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 
08:42:35.218708 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg"] Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.222431 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.242809 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.264952 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.270780 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.277293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6fd95bed-c64a-41fa-9382-233d6b69b31b-srv-cert\") pod \"catalog-operator-68c6474976-8zvcg\" (UID: \"6fd95bed-c64a-41fa-9382-233d6b69b31b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:35 crc kubenswrapper[4790]: E1003 08:42:35.284695 4790 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Oct 03 08:42:35 crc kubenswrapper[4790]: E1003 08:42:35.284816 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-operator-metrics podName:a04ec950-87ce-4f04-9c9d-201a4f6099a1 nodeName:}" failed. 
No retries permitted until 2025-10-03 08:42:35.784788314 +0000 UTC m=+142.137722552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-operator-metrics") pod "marketplace-operator-79b997595-h7tdr" (UID: "a04ec950-87ce-4f04-9c9d-201a4f6099a1") : failed to sync secret cache: timed out waiting for the condition Oct 03 08:42:35 crc kubenswrapper[4790]: E1003 08:42:35.285099 4790 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Oct 03 08:42:35 crc kubenswrapper[4790]: E1003 08:42:35.285148 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-trusted-ca podName:a04ec950-87ce-4f04-9c9d-201a4f6099a1 nodeName:}" failed. No retries permitted until 2025-10-03 08:42:35.785135224 +0000 UTC m=+142.138069462 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-trusted-ca") pod "marketplace-operator-79b997595-h7tdr" (UID: "a04ec950-87ce-4f04-9c9d-201a4f6099a1") : failed to sync configmap cache: timed out waiting for the condition Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.286724 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.305593 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.309362 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wlp9g"] Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.322341 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.346753 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.362428 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.375975 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.385247 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.400278 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tdsgg"] Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.402228 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.422739 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 03 08:42:35 crc kubenswrapper[4790]: W1003 08:42:35.436292 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bf63bc2_e5fc_4974_b78e_185796fea03e.slice/crio-baffeff961dc8555947bfd2a0d8dd5a3dbaccffc3162a3c75124d4e4b4bec747 WatchSource:0}: Error finding container baffeff961dc8555947bfd2a0d8dd5a3dbaccffc3162a3c75124d4e4b4bec747: Status 404 returned error can't find the container with id baffeff961dc8555947bfd2a0d8dd5a3dbaccffc3162a3c75124d4e4b4bec747 Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.467344 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljf4f\" (UniqueName: \"kubernetes.io/projected/74be60a5-ccdf-40b5-bc76-ebf2196ae376-kube-api-access-ljf4f\") pod \"route-controller-manager-6576b87f9c-tvcgx\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.481984 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kclsv\" (UniqueName: \"kubernetes.io/projected/6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd-kube-api-access-kclsv\") pod \"openshift-apiserver-operator-796bbdcf4f-cxxjr\" (UID: \"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.486532 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-msk65"] Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.501057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-62t8j\" (UID: \"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.502798 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.523001 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.542268 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.556346 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln"] Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.564507 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 03 08:42:35 crc kubenswrapper[4790]: W1003 08:42:35.572400 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44aed4cf_ccda_49ee_9d8a_c1b95946bdf9.slice/crio-7bac19897984f4dcede801bdcd85908aee731a948eb7ebef766593aa4d0a423e WatchSource:0}: Error finding container 7bac19897984f4dcede801bdcd85908aee731a948eb7ebef766593aa4d0a423e: Status 404 returned error can't find the container with id 7bac19897984f4dcede801bdcd85908aee731a948eb7ebef766593aa4d0a423e Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.585045 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.606350 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.621631 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.635120 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jk7ch"] Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.643897 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.664559 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.666269 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.683650 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.704689 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.708048 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.723681 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ggr2t"] Oct 03 08:42:35 crc kubenswrapper[4790]: W1003 08:42:35.730481 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c013e7_68e8_462d_9d0d_3079d2245c8f.slice/crio-3acfb07e3ea407695536bbbdd2a40329169830a174e5b040c892520c63d9509a WatchSource:0}: Error finding container 3acfb07e3ea407695536bbbdd2a40329169830a174e5b040c892520c63d9509a: Status 404 returned error can't find the container with id 3acfb07e3ea407695536bbbdd2a40329169830a174e5b040c892520c63d9509a Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.742354 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.752707 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.762166 
4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.777981 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.785506 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.804468 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.804627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.805319 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.806494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.816177 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.823186 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.843084 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.866179 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.884044 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.912195 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.922509 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.941096 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx"] Oct 03 08:42:35 crc kubenswrapper[4790]: I1003 08:42:35.946899 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:42:35 crc 
kubenswrapper[4790]: I1003 08:42:35.983055 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.002992 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.023401 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.030599 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j"] Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.045371 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 03 08:42:36 crc kubenswrapper[4790]: W1003 08:42:36.051306 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod484bb0d4_e7ab_406b_a9e8_5972d7cbfa8d.slice/crio-206e6159d5b0a2e9b8bd34e64504a52a48a946253fbe57c809b447c42fcae9d3 WatchSource:0}: Error finding container 206e6159d5b0a2e9b8bd34e64504a52a48a946253fbe57c809b447c42fcae9d3: Status 404 returned error can't find the container with id 206e6159d5b0a2e9b8bd34e64504a52a48a946253fbe57c809b447c42fcae9d3 Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.063261 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.097048 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.107927 4790 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.129610 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.140031 4790 request.go:700] Waited for 1.967982324s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/serviceaccounts/olm-operator-serviceaccount/token Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.160817 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8gwr\" (UniqueName: \"kubernetes.io/projected/81ab3c47-6cf1-493f-b1bd-730c970de51f-kube-api-access-b8gwr\") pod \"packageserver-d55dfcdfc-8p9cs\" (UID: \"81ab3c47-6cf1-493f-b1bd-730c970de51f\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.181435 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsj27\" (UniqueName: \"kubernetes.io/projected/0826b49c-d336-4af5-82fc-c0e9d03cc7f4-kube-api-access-nsj27\") pod \"dns-operator-744455d44c-kzt96\" (UID: \"0826b49c-d336-4af5-82fc-c0e9d03cc7f4\") " pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.198152 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fpg5\" (UniqueName: \"kubernetes.io/projected/560651e5-35a6-4428-b763-972bdbef6c6d-kube-api-access-7fpg5\") pod \"openshift-controller-manager-operator-756b6f6bc6-lzfvb\" (UID: \"560651e5-35a6-4428-b763-972bdbef6c6d\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.198386 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" event={"ID":"3bf63bc2-e5fc-4974-b78e-185796fea03e","Type":"ContainerStarted","Data":"e2db524342a88bc7dcd429091f1f63fcfaf4f9a51cde62b28fc9fd2132843d3a"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.198480 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.198499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" event={"ID":"3bf63bc2-e5fc-4974-b78e-185796fea03e","Type":"ContainerStarted","Data":"baffeff961dc8555947bfd2a0d8dd5a3dbaccffc3162a3c75124d4e4b4bec747"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.201152 4790 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tdsgg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.201199 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" podUID="3bf63bc2-e5fc-4974-b78e-185796fea03e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.201887 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" event={"ID":"80c013e7-68e8-462d-9d0d-3079d2245c8f","Type":"ContainerStarted","Data":"aa5199d16173900f5f494670d91113ef2a5ac30fff8d9e5ca63b4d20fb09e0fe"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.201925 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" event={"ID":"80c013e7-68e8-462d-9d0d-3079d2245c8f","Type":"ContainerStarted","Data":"3acfb07e3ea407695536bbbdd2a40329169830a174e5b040c892520c63d9509a"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.204412 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" event={"ID":"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d","Type":"ContainerStarted","Data":"206e6159d5b0a2e9b8bd34e64504a52a48a946253fbe57c809b447c42fcae9d3"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.206252 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-msk65" event={"ID":"04e0d67a-57a0-47be-8b3a-79e45a4c17cd","Type":"ContainerStarted","Data":"190b9c5a5c90693c31c26663cc6b34a092be77604c258e34c8d74df700a984f1"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.206289 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-msk65" event={"ID":"04e0d67a-57a0-47be-8b3a-79e45a4c17cd","Type":"ContainerStarted","Data":"dc2dba70dfb038de375f62d7d76f924d452eec8b36c81886371e0390f70a639b"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.207157 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.209523 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-msk65 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.209557 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-msk65" 
podUID="04e0d67a-57a0-47be-8b3a-79e45a4c17cd" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.209639 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2c063f6-68dc-4e05-bb17-fcbee1a2eec6" containerID="262449af3210b3e67bf479014e5499ca2047ab43c3d6d8608ff8b194641486aa" exitCode=0 Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.209656 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" event={"ID":"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6","Type":"ContainerDied","Data":"262449af3210b3e67bf479014e5499ca2047ab43c3d6d8608ff8b194641486aa"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.209984 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" event={"ID":"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6","Type":"ContainerStarted","Data":"7928a2879db1eae9c416820c1988c72aa835bd8f2ad68bad2427ba6129e93f06"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.211905 4790 generic.go:334] "Generic (PLEG): container finished" podID="1f090fa3-d832-4693-a9a7-24cdae8aae07" containerID="5cd933a58a5f4cf507041db9811fab78bb3f9aaf64120a411b9fd936501da122" exitCode=0 Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.212003 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" event={"ID":"1f090fa3-d832-4693-a9a7-24cdae8aae07","Type":"ContainerDied","Data":"5cd933a58a5f4cf507041db9811fab78bb3f9aaf64120a411b9fd936501da122"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.212062 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" 
event={"ID":"1f090fa3-d832-4693-a9a7-24cdae8aae07","Type":"ContainerStarted","Data":"80d93b6e3c2882224bf704c6f4b599a88905300b1d70f0997bee9011705c5c94"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.215730 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" event={"ID":"01e7c005-7917-4002-b9d7-df2f8f138162","Type":"ContainerStarted","Data":"29b6b0137bb12a705674924b68587413fa6ad3f88fe4fb01e99e4e4620ae343d"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.215777 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" event={"ID":"01e7c005-7917-4002-b9d7-df2f8f138162","Type":"ContainerStarted","Data":"c81ca0119ef048ee8b5490c17e7205628b445a4b018b0e6aa8ce3f4fc7a7b21f"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.215793 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" event={"ID":"01e7c005-7917-4002-b9d7-df2f8f138162","Type":"ContainerStarted","Data":"702b7f67b51b84c4437c8dd3e66a2fa5e0a4e1ccd083b8fb35599b27668cf4c5"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.221601 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhmwn\" (UniqueName: \"kubernetes.io/projected/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-kube-api-access-mhmwn\") pod \"oauth-openshift-558db77b4-r9zxx\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.223093 4790 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.227320 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" 
event={"ID":"3af1e215-a4dc-4be0-b4a8-d6fdbd120f2f","Type":"ContainerStarted","Data":"71815906a0cb76a4f9f42f26b41583a80dc435b059041dc730d62f441d5345af"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.230987 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" event={"ID":"74be60a5-ccdf-40b5-bc76-ebf2196ae376","Type":"ContainerStarted","Data":"ce6b5ae317d76771beeaac7b0a3d9a71ec020b54a995aae36e41c7855e739511"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.236245 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ggr2t" event={"ID":"f53170bf-fd8d-41c2-b353-50c5d5cff3d8","Type":"ContainerStarted","Data":"60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.236319 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ggr2t" event={"ID":"f53170bf-fd8d-41c2-b353-50c5d5cff3d8","Type":"ContainerStarted","Data":"db3e9e4f1ee8ad6d7c52a497378325ae5667f61f6e1c6d515e9f10dfbf54f0a5"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.238072 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" event={"ID":"2f81f816-0c58-437b-be43-b782515e8997","Type":"ContainerStarted","Data":"9a7f0b7fb89ac0d0b3873b5258c7cc7852bc1bc93f09438cecb2b528f75e2ea3"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.238106 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" event={"ID":"2f81f816-0c58-437b-be43-b782515e8997","Type":"ContainerStarted","Data":"65900daa534a207d66a43bdcb077e9746dabd53915fe7db9b8ab74bb03e18aaf"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.238122 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" 
event={"ID":"2f81f816-0c58-437b-be43-b782515e8997","Type":"ContainerStarted","Data":"c489ea5b50e4f651c855cc5d66d0f2c389a5a9f999da52dee624174a4e3177f4"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.242710 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.243058 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" event={"ID":"5b33ea81-e9c2-4297-a0ed-6fe0a1688563","Type":"ContainerStarted","Data":"7fab29469e11c2b3d140f3d78a8c1804f6b751a5184b22951e2f70202a1d7b16"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.243098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" event={"ID":"5b33ea81-e9c2-4297-a0ed-6fe0a1688563","Type":"ContainerStarted","Data":"f6d8e0767fcd44ad755da9dc43884854c2768fb7b8dc66d0d74f8c2eed42ebc2"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.243114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" event={"ID":"5b33ea81-e9c2-4297-a0ed-6fe0a1688563","Type":"ContainerStarted","Data":"d818595849b511ed87427322e708a0e2bcc0e758415737091feadfd77ef50dc5"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.245596 4790 generic.go:334] "Generic (PLEG): container finished" podID="44aed4cf-ccda-49ee-9d8a-c1b95946bdf9" containerID="afb50c1551b59366ec010cd6cb7640752f6bf08968b9399b3dfee55b10bb30a1" exitCode=0 Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.245687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" event={"ID":"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9","Type":"ContainerDied","Data":"afb50c1551b59366ec010cd6cb7640752f6bf08968b9399b3dfee55b10bb30a1"} Oct 03 08:42:36 crc 
kubenswrapper[4790]: I1003 08:42:36.245726 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" event={"ID":"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9","Type":"ContainerStarted","Data":"7bac19897984f4dcede801bdcd85908aee731a948eb7ebef766593aa4d0a423e"} Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.246724 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-4gzvp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.246801 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4gzvp" podUID="cf6939a0-5f2b-4db5-b5c2-613e19bb4b43" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.263000 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.298362 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr"] Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.306736 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwc8p\" (UniqueName: \"kubernetes.io/projected/9b926d95-536f-4925-8078-d14df3d56611-kube-api-access-qwc8p\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.319635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6ngr\" (UniqueName: 
\"kubernetes.io/projected/a04ec950-87ce-4f04-9c9d-201a4f6099a1-kube-api-access-z6ngr\") pod \"marketplace-operator-79b997595-h7tdr\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.333955 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.338909 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqz8p\" (UniqueName: \"kubernetes.io/projected/89e2e475-d36b-4626-ab23-e767070eb376-kube-api-access-rqz8p\") pod \"control-plane-machine-set-operator-78cbb6b69f-dp7r9\" (UID: \"89e2e475-d36b-4626-ab23-e767070eb376\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.343228 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.351073 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.358627 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.364718 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc49b92e-3574-4789-9f65-c2aeb678899b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qgfpf\" (UID: \"fc49b92e-3574-4789-9f65-c2aeb678899b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.382826 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b926d95-536f-4925-8078-d14df3d56611-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v6bmf\" (UID: \"9b926d95-536f-4925-8078-d14df3d56611\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.408648 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69spb\" (UniqueName: \"kubernetes.io/projected/6fd95bed-c64a-41fa-9382-233d6b69b31b-kube-api-access-69spb\") pod \"catalog-operator-68c6474976-8zvcg\" (UID: \"6fd95bed-c64a-41fa-9382-233d6b69b31b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.423932 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.428895 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgcpz\" (UniqueName: \"kubernetes.io/projected/3db3e714-d4c8-4e90-9d7a-d694fadfa856-kube-api-access-kgcpz\") pod \"dns-default-2j9mx\" (UID: \"3db3e714-d4c8-4e90-9d7a-d694fadfa856\") " pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.454228 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.465559 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.506592 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526108 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68fadce0-3257-4525-b81f-552537c013be-config-volume\") pod \"collect-profiles-29324670-6j9ps\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r499b\" (UniqueName: \"kubernetes.io/projected/e6ba1e49-48e4-4fa7-860e-d7fac4e26c99-kube-api-access-r499b\") pod \"migrator-59844c95c7-c8l5f\" (UID: \"e6ba1e49-48e4-4fa7-860e-d7fac4e26c99\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4l52\" (UniqueName: \"kubernetes.io/projected/ef2e3da0-4171-404c-9521-6b69ee4b54f6-kube-api-access-f4l52\") pod \"multus-admission-controller-857f4d67dd-mksjr\" (UID: \"ef2e3da0-4171-404c-9521-6b69ee4b54f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526213 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqtn6\" (UniqueName: \"kubernetes.io/projected/6c82e80d-c50d-4475-839c-f757c4e545f7-kube-api-access-fqtn6\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526232 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce133d91-dccc-4862-ac75-8e3de6fb88cf-signing-key\") pod \"service-ca-9c57cc56f-np5k5\" (UID: \"ce133d91-dccc-4862-ac75-8e3de6fb88cf\") " pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526258 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526291 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndt26\" (UniqueName: \"kubernetes.io/projected/29ea825c-5d1f-4563-af62-10be06c59e0a-kube-api-access-ndt26\") pod \"olm-operator-6b444d44fb-87tx4\" (UID: \"29ea825c-5d1f-4563-af62-10be06c59e0a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526306 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e42f548-0f82-45ec-9d39-7131fe9f66ad-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nlpzc\" (UID: \"1e42f548-0f82-45ec-9d39-7131fe9f66ad\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e42f548-0f82-45ec-9d39-7131fe9f66ad-config\") pod 
\"kube-controller-manager-operator-78b949d7b-nlpzc\" (UID: \"1e42f548-0f82-45ec-9d39-7131fe9f66ad\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526374 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fb4k\" (UniqueName: \"kubernetes.io/projected/801aeaf6-dba4-4080-ba94-52aaace1c77b-kube-api-access-6fb4k\") pod \"package-server-manager-789f6589d5-8f5qg\" (UID: \"801aeaf6-dba4-4080-ba94-52aaace1c77b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526406 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djj76\" (UniqueName: \"kubernetes.io/projected/cb758737-329d-43f2-9fa0-50537bd62787-kube-api-access-djj76\") pod \"kube-storage-version-migrator-operator-b67b599dd-sv5st\" (UID: \"cb758737-329d-43f2-9fa0-50537bd62787\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526442 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb758737-329d-43f2-9fa0-50537bd62787-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sv5st\" (UID: \"cb758737-329d-43f2-9fa0-50537bd62787\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526483 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-bound-sa-token\") pod \"image-registry-697d97f7c8-csbrj\" (UID: 
\"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6c82e80d-c50d-4475-839c-f757c4e545f7-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526538 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526594 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29ea825c-5d1f-4563-af62-10be06c59e0a-srv-cert\") pod \"olm-operator-6b444d44fb-87tx4\" (UID: \"29ea825c-5d1f-4563-af62-10be06c59e0a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526611 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-certificates\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526647 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68fadce0-3257-4525-b81f-552537c013be-secret-volume\") pod \"collect-profiles-29324670-6j9ps\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526692 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/862522f5-5c44-456c-a8c3-787d7dc573df-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hwr6b\" (UID: \"862522f5-5c44-456c-a8c3-787d7dc573df\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526707 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/862522f5-5c44-456c-a8c3-787d7dc573df-config\") pod \"kube-apiserver-operator-766d6c64bb-hwr6b\" (UID: \"862522f5-5c44-456c-a8c3-787d7dc573df\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526722 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c82e80d-c50d-4475-839c-f757c4e545f7-proxy-tls\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526751 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w2p4\" (UniqueName: \"kubernetes.io/projected/8f295eb1-856c-4063-8411-3282ffac9de5-kube-api-access-7w2p4\") pod 
\"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526786 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f295eb1-856c-4063-8411-3282ffac9de5-service-ca-bundle\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526824 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f295eb1-856c-4063-8411-3282ffac9de5-metrics-certs\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krnzm\" (UniqueName: \"kubernetes.io/projected/77532493-1d9d-47fe-ae15-2163ee04a272-kube-api-access-krnzm\") pod \"machine-config-controller-84d6567774-pssx9\" (UID: \"77532493-1d9d-47fe-ae15-2163ee04a272\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h66p\" (UniqueName: \"kubernetes.io/projected/ebbc7ead-1893-4ca5-bd08-3b3f0090a781-kube-api-access-2h66p\") pod \"service-ca-operator-777779d784-5btj7\" (UID: \"ebbc7ead-1893-4ca5-bd08-3b3f0090a781\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526906 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c64dk\" (UniqueName: \"kubernetes.io/projected/ce133d91-dccc-4862-ac75-8e3de6fb88cf-kube-api-access-c64dk\") pod \"service-ca-9c57cc56f-np5k5\" (UID: \"ce133d91-dccc-4862-ac75-8e3de6fb88cf\") " pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526922 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8f295eb1-856c-4063-8411-3282ffac9de5-default-certificate\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526942 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-trusted-ca\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526974 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6c82e80d-c50d-4475-839c-f757c4e545f7-images\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.526995 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce133d91-dccc-4862-ac75-8e3de6fb88cf-signing-cabundle\") pod \"service-ca-9c57cc56f-np5k5\" (UID: 
\"ce133d91-dccc-4862-ac75-8e3de6fb88cf\") " pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e42f548-0f82-45ec-9d39-7131fe9f66ad-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nlpzc\" (UID: \"1e42f548-0f82-45ec-9d39-7131fe9f66ad\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527037 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/801aeaf6-dba4-4080-ba94-52aaace1c77b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8f5qg\" (UID: \"801aeaf6-dba4-4080-ba94-52aaace1c77b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527103 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77532493-1d9d-47fe-ae15-2163ee04a272-proxy-tls\") pod \"machine-config-controller-84d6567774-pssx9\" (UID: \"77532493-1d9d-47fe-ae15-2163ee04a272\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527137 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8f295eb1-856c-4063-8411-3282ffac9de5-stats-auth\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527155 
4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ef2e3da0-4171-404c-9521-6b69ee4b54f6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mksjr\" (UID: \"ef2e3da0-4171-404c-9521-6b69ee4b54f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527189 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-tls\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527222 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77532493-1d9d-47fe-ae15-2163ee04a272-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pssx9\" (UID: \"77532493-1d9d-47fe-ae15-2163ee04a272\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527239 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527255 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv9wm\" (UniqueName: 
\"kubernetes.io/projected/68fadce0-3257-4525-b81f-552537c013be-kube-api-access-lv9wm\") pod \"collect-profiles-29324670-6j9ps\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527296 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb758737-329d-43f2-9fa0-50537bd62787-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sv5st\" (UID: \"cb758737-329d-43f2-9fa0-50537bd62787\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527356 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/862522f5-5c44-456c-a8c3-787d7dc573df-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hwr6b\" (UID: \"862522f5-5c44-456c-a8c3-787d7dc573df\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527399 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjp5v\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-kube-api-access-tjp5v\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527414 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebbc7ead-1893-4ca5-bd08-3b3f0090a781-config\") pod \"service-ca-operator-777779d784-5btj7\" (UID: \"ebbc7ead-1893-4ca5-bd08-3b3f0090a781\") 
" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527449 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29ea825c-5d1f-4563-af62-10be06c59e0a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-87tx4\" (UID: \"29ea825c-5d1f-4563-af62-10be06c59e0a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.527487 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebbc7ead-1893-4ca5-bd08-3b3f0090a781-serving-cert\") pod \"service-ca-operator-777779d784-5btj7\" (UID: \"ebbc7ead-1893-4ca5-bd08-3b3f0090a781\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:36 crc kubenswrapper[4790]: E1003 08:42:36.531428 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.031409955 +0000 UTC m=+143.384344393 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.628916 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:36 crc kubenswrapper[4790]: E1003 08:42:36.629268 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.129232014 +0000 UTC m=+143.482166252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.629663 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e42f548-0f82-45ec-9d39-7131fe9f66ad-config\") pod \"kube-controller-manager-operator-78b949d7b-nlpzc\" (UID: \"1e42f548-0f82-45ec-9d39-7131fe9f66ad\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.629697 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fb4k\" (UniqueName: \"kubernetes.io/projected/801aeaf6-dba4-4080-ba94-52aaace1c77b-kube-api-access-6fb4k\") pod \"package-server-manager-789f6589d5-8f5qg\" (UID: \"801aeaf6-dba4-4080-ba94-52aaace1c77b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.629749 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djj76\" (UniqueName: \"kubernetes.io/projected/cb758737-329d-43f2-9fa0-50537bd62787-kube-api-access-djj76\") pod \"kube-storage-version-migrator-operator-b67b599dd-sv5st\" (UID: \"cb758737-329d-43f2-9fa0-50537bd62787\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.629828 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/cb758737-329d-43f2-9fa0-50537bd62787-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sv5st\" (UID: \"cb758737-329d-43f2-9fa0-50537bd62787\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.629852 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6c82e80d-c50d-4475-839c-f757c4e545f7-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.629897 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-bound-sa-token\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.629961 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0e3b1091-0285-4063-9031-fdca0671b63c-node-bootstrap-token\") pod \"machine-config-server-jwdjw\" (UID: \"0e3b1091-0285-4063-9031-fdca0671b63c\") " pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.629986 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630037 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29ea825c-5d1f-4563-af62-10be06c59e0a-srv-cert\") pod \"olm-operator-6b444d44fb-87tx4\" (UID: \"29ea825c-5d1f-4563-af62-10be06c59e0a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630063 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-certificates\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630132 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-plugins-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-mountpoint-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630225 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68fadce0-3257-4525-b81f-552537c013be-secret-volume\") pod \"collect-profiles-29324670-6j9ps\" 
(UID: \"68fadce0-3257-4525-b81f-552537c013be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630258 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c82e80d-c50d-4475-839c-f757c4e545f7-proxy-tls\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630295 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/862522f5-5c44-456c-a8c3-787d7dc573df-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hwr6b\" (UID: \"862522f5-5c44-456c-a8c3-787d7dc573df\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630317 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/862522f5-5c44-456c-a8c3-787d7dc573df-config\") pod \"kube-apiserver-operator-766d6c64bb-hwr6b\" (UID: \"862522f5-5c44-456c-a8c3-787d7dc573df\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630339 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w2p4\" (UniqueName: \"kubernetes.io/projected/8f295eb1-856c-4063-8411-3282ffac9de5-kube-api-access-7w2p4\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630403 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8f295eb1-856c-4063-8411-3282ffac9de5-service-ca-bundle\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630441 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krnzm\" (UniqueName: \"kubernetes.io/projected/77532493-1d9d-47fe-ae15-2163ee04a272-kube-api-access-krnzm\") pod \"machine-config-controller-84d6567774-pssx9\" (UID: \"77532493-1d9d-47fe-ae15-2163ee04a272\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630465 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f295eb1-856c-4063-8411-3282ffac9de5-metrics-certs\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.630483 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e42f548-0f82-45ec-9d39-7131fe9f66ad-config\") pod \"kube-controller-manager-operator-78b949d7b-nlpzc\" (UID: \"1e42f548-0f82-45ec-9d39-7131fe9f66ad\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632106 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8f295eb1-856c-4063-8411-3282ffac9de5-default-certificate\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632141 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h66p\" (UniqueName: \"kubernetes.io/projected/ebbc7ead-1893-4ca5-bd08-3b3f0090a781-kube-api-access-2h66p\") pod \"service-ca-operator-777779d784-5btj7\" (UID: \"ebbc7ead-1893-4ca5-bd08-3b3f0090a781\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632168 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c64dk\" (UniqueName: \"kubernetes.io/projected/ce133d91-dccc-4862-ac75-8e3de6fb88cf-kube-api-access-c64dk\") pod \"service-ca-9c57cc56f-np5k5\" (UID: \"ce133d91-dccc-4862-ac75-8e3de6fb88cf\") " pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632198 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-trusted-ca\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632216 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/801aeaf6-dba4-4080-ba94-52aaace1c77b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8f5qg\" (UID: \"801aeaf6-dba4-4080-ba94-52aaace1c77b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632239 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6c82e80d-c50d-4475-839c-f757c4e545f7-images\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632258 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce133d91-dccc-4862-ac75-8e3de6fb88cf-signing-cabundle\") pod \"service-ca-9c57cc56f-np5k5\" (UID: \"ce133d91-dccc-4862-ac75-8e3de6fb88cf\") " pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e42f548-0f82-45ec-9d39-7131fe9f66ad-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nlpzc\" (UID: \"1e42f548-0f82-45ec-9d39-7131fe9f66ad\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632300 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87mj6\" (UniqueName: \"kubernetes.io/projected/ff2a3a67-029a-4dbb-9acc-fc1456977f72-kube-api-access-87mj6\") pod \"ingress-canary-qc2rd\" (UID: \"ff2a3a67-029a-4dbb-9acc-fc1456977f72\") " pod="openshift-ingress-canary/ingress-canary-qc2rd" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632358 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77532493-1d9d-47fe-ae15-2163ee04a272-proxy-tls\") pod \"machine-config-controller-84d6567774-pssx9\" (UID: \"77532493-1d9d-47fe-ae15-2163ee04a272\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632373 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-csi-data-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632401 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8f295eb1-856c-4063-8411-3282ffac9de5-stats-auth\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ef2e3da0-4171-404c-9521-6b69ee4b54f6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mksjr\" (UID: \"ef2e3da0-4171-404c-9521-6b69ee4b54f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632429 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb758737-329d-43f2-9fa0-50537bd62787-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sv5st\" (UID: \"cb758737-329d-43f2-9fa0-50537bd62787\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632440 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-tls\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632531 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77532493-1d9d-47fe-ae15-2163ee04a272-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pssx9\" (UID: \"77532493-1d9d-47fe-ae15-2163ee04a272\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6c82e80d-c50d-4475-839c-f757c4e545f7-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.632999 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.633023 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv9wm\" (UniqueName: \"kubernetes.io/projected/68fadce0-3257-4525-b81f-552537c013be-kube-api-access-lv9wm\") pod \"collect-profiles-29324670-6j9ps\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.633566 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-ca-trust-extracted\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.635981 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26qgl\" (UniqueName: \"kubernetes.io/projected/0e3b1091-0285-4063-9031-fdca0671b63c-kube-api-access-26qgl\") pod \"machine-config-server-jwdjw\" (UID: \"0e3b1091-0285-4063-9031-fdca0671b63c\") " pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.636091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb758737-329d-43f2-9fa0-50537bd62787-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sv5st\" (UID: \"cb758737-329d-43f2-9fa0-50537bd62787\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.636849 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6c82e80d-c50d-4475-839c-f757c4e545f7-images\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.639477 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-trusted-ca\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.639659 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-certificates\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.639981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/77532493-1d9d-47fe-ae15-2163ee04a272-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-pssx9\" (UID: \"77532493-1d9d-47fe-ae15-2163ee04a272\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.640330 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/862522f5-5c44-456c-a8c3-787d7dc573df-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hwr6b\" (UID: \"862522f5-5c44-456c-a8c3-787d7dc573df\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.640366 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebbc7ead-1893-4ca5-bd08-3b3f0090a781-config\") pod \"service-ca-operator-777779d784-5btj7\" (UID: \"ebbc7ead-1893-4ca5-bd08-3b3f0090a781\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.641104 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb758737-329d-43f2-9fa0-50537bd62787-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sv5st\" (UID: \"cb758737-329d-43f2-9fa0-50537bd62787\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.641496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebbc7ead-1893-4ca5-bd08-3b3f0090a781-config\") pod \"service-ca-operator-777779d784-5btj7\" (UID: \"ebbc7ead-1893-4ca5-bd08-3b3f0090a781\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.641547 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6c82e80d-c50d-4475-839c-f757c4e545f7-proxy-tls\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.641553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjp5v\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-kube-api-access-tjp5v\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.641643 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0e3b1091-0285-4063-9031-fdca0671b63c-certs\") pod \"machine-config-server-jwdjw\" (UID: \"0e3b1091-0285-4063-9031-fdca0671b63c\") " pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.641747 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8f295eb1-856c-4063-8411-3282ffac9de5-service-ca-bundle\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.642667 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce133d91-dccc-4862-ac75-8e3de6fb88cf-signing-cabundle\") pod \"service-ca-9c57cc56f-np5k5\" (UID: \"ce133d91-dccc-4862-ac75-8e3de6fb88cf\") " pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.643006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29ea825c-5d1f-4563-af62-10be06c59e0a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-87tx4\" (UID: \"29ea825c-5d1f-4563-af62-10be06c59e0a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.643149 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/862522f5-5c44-456c-a8c3-787d7dc573df-config\") pod \"kube-apiserver-operator-766d6c64bb-hwr6b\" (UID: \"862522f5-5c44-456c-a8c3-787d7dc573df\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.643249 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-installation-pull-secrets\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.643386 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-registration-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.644274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebbc7ead-1893-4ca5-bd08-3b3f0090a781-serving-cert\") pod \"service-ca-operator-777779d784-5btj7\" (UID: \"ebbc7ead-1893-4ca5-bd08-3b3f0090a781\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.645041 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff2a3a67-029a-4dbb-9acc-fc1456977f72-cert\") pod \"ingress-canary-qc2rd\" (UID: \"ff2a3a67-029a-4dbb-9acc-fc1456977f72\") " pod="openshift-ingress-canary/ingress-canary-qc2rd" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.645442 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68fadce0-3257-4525-b81f-552537c013be-config-volume\") pod \"collect-profiles-29324670-6j9ps\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.645508 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r499b\" (UniqueName: \"kubernetes.io/projected/e6ba1e49-48e4-4fa7-860e-d7fac4e26c99-kube-api-access-r499b\") pod \"migrator-59844c95c7-c8l5f\" (UID: \"e6ba1e49-48e4-4fa7-860e-d7fac4e26c99\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f" Oct 03 08:42:36 crc 
kubenswrapper[4790]: I1003 08:42:36.645535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q77cj\" (UniqueName: \"kubernetes.io/projected/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-kube-api-access-q77cj\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.645727 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4l52\" (UniqueName: \"kubernetes.io/projected/ef2e3da0-4171-404c-9521-6b69ee4b54f6-kube-api-access-f4l52\") pod \"multus-admission-controller-857f4d67dd-mksjr\" (UID: \"ef2e3da0-4171-404c-9521-6b69ee4b54f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.645863 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqtn6\" (UniqueName: \"kubernetes.io/projected/6c82e80d-c50d-4475-839c-f757c4e545f7-kube-api-access-fqtn6\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.646883 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce133d91-dccc-4862-ac75-8e3de6fb88cf-signing-key\") pod \"service-ca-9c57cc56f-np5k5\" (UID: \"ce133d91-dccc-4862-ac75-8e3de6fb88cf\") " pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.646919 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.646941 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-socket-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.646969 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndt26\" (UniqueName: \"kubernetes.io/projected/29ea825c-5d1f-4563-af62-10be06c59e0a-kube-api-access-ndt26\") pod \"olm-operator-6b444d44fb-87tx4\" (UID: \"29ea825c-5d1f-4563-af62-10be06c59e0a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.646985 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e42f548-0f82-45ec-9d39-7131fe9f66ad-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nlpzc\" (UID: \"1e42f548-0f82-45ec-9d39-7131fe9f66ad\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:36 crc kubenswrapper[4790]: E1003 08:42:36.647635 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.147621246 +0000 UTC m=+143.500555484 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.657494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68fadce0-3257-4525-b81f-552537c013be-secret-volume\") pod \"collect-profiles-29324670-6j9ps\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.658002 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68fadce0-3257-4525-b81f-552537c013be-config-volume\") pod \"collect-profiles-29324670-6j9ps\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.661328 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/862522f5-5c44-456c-a8c3-787d7dc573df-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hwr6b\" (UID: \"862522f5-5c44-456c-a8c3-787d7dc573df\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.661409 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebbc7ead-1893-4ca5-bd08-3b3f0090a781-serving-cert\") pod \"service-ca-operator-777779d784-5btj7\" (UID: 
\"ebbc7ead-1893-4ca5-bd08-3b3f0090a781\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.661762 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ef2e3da0-4171-404c-9521-6b69ee4b54f6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mksjr\" (UID: \"ef2e3da0-4171-404c-9521-6b69ee4b54f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.661958 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-tls\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.664285 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8f295eb1-856c-4063-8411-3282ffac9de5-default-certificate\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.664683 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e42f548-0f82-45ec-9d39-7131fe9f66ad-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nlpzc\" (UID: \"1e42f548-0f82-45ec-9d39-7131fe9f66ad\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.665220 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/8f295eb1-856c-4063-8411-3282ffac9de5-metrics-certs\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.666045 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce133d91-dccc-4862-ac75-8e3de6fb88cf-signing-key\") pod \"service-ca-9c57cc56f-np5k5\" (UID: \"ce133d91-dccc-4862-ac75-8e3de6fb88cf\") " pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.667659 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.668202 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8f295eb1-856c-4063-8411-3282ffac9de5-stats-auth\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.672529 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/801aeaf6-dba4-4080-ba94-52aaace1c77b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8f5qg\" (UID: \"801aeaf6-dba4-4080-ba94-52aaace1c77b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.679608 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.684952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fb4k\" (UniqueName: \"kubernetes.io/projected/801aeaf6-dba4-4080-ba94-52aaace1c77b-kube-api-access-6fb4k\") pod \"package-server-manager-789f6589d5-8f5qg\" (UID: \"801aeaf6-dba4-4080-ba94-52aaace1c77b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.685105 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/77532493-1d9d-47fe-ae15-2163ee04a272-proxy-tls\") pod \"machine-config-controller-84d6567774-pssx9\" (UID: \"77532493-1d9d-47fe-ae15-2163ee04a272\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.688304 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/29ea825c-5d1f-4563-af62-10be06c59e0a-srv-cert\") pod \"olm-operator-6b444d44fb-87tx4\" (UID: \"29ea825c-5d1f-4563-af62-10be06c59e0a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.703014 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/29ea825c-5d1f-4563-af62-10be06c59e0a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-87tx4\" (UID: \"29ea825c-5d1f-4563-af62-10be06c59e0a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.705496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-bound-sa-token\") pod 
\"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.731180 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djj76\" (UniqueName: \"kubernetes.io/projected/cb758737-329d-43f2-9fa0-50537bd62787-kube-api-access-djj76\") pod \"kube-storage-version-migrator-operator-b67b599dd-sv5st\" (UID: \"cb758737-329d-43f2-9fa0-50537bd62787\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.745107 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv9wm\" (UniqueName: \"kubernetes.io/projected/68fadce0-3257-4525-b81f-552537c013be-kube-api-access-lv9wm\") pod \"collect-profiles-29324670-6j9ps\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753450 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753664 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-socket-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753723 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0e3b1091-0285-4063-9031-fdca0671b63c-node-bootstrap-token\") pod \"machine-config-server-jwdjw\" (UID: \"0e3b1091-0285-4063-9031-fdca0671b63c\") " pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753746 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-plugins-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753763 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-mountpoint-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753832 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87mj6\" (UniqueName: \"kubernetes.io/projected/ff2a3a67-029a-4dbb-9acc-fc1456977f72-kube-api-access-87mj6\") pod \"ingress-canary-qc2rd\" (UID: \"ff2a3a67-029a-4dbb-9acc-fc1456977f72\") " pod="openshift-ingress-canary/ingress-canary-qc2rd" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753850 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-csi-data-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753876 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-26qgl\" (UniqueName: \"kubernetes.io/projected/0e3b1091-0285-4063-9031-fdca0671b63c-kube-api-access-26qgl\") pod \"machine-config-server-jwdjw\" (UID: \"0e3b1091-0285-4063-9031-fdca0671b63c\") " pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753919 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0e3b1091-0285-4063-9031-fdca0671b63c-certs\") pod \"machine-config-server-jwdjw\" (UID: \"0e3b1091-0285-4063-9031-fdca0671b63c\") " pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753946 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-registration-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753966 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff2a3a67-029a-4dbb-9acc-fc1456977f72-cert\") pod \"ingress-canary-qc2rd\" (UID: \"ff2a3a67-029a-4dbb-9acc-fc1456977f72\") " pod="openshift-ingress-canary/ingress-canary-qc2rd" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.753990 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q77cj\" (UniqueName: \"kubernetes.io/projected/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-kube-api-access-q77cj\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.754212 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" 
(UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-csi-data-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: E1003 08:42:36.754487 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.254462247 +0000 UTC m=+143.607396485 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.755064 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-socket-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.755101 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-registration-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.755157 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" 
(UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-plugins-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.755204 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-mountpoint-dir\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.759357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff2a3a67-029a-4dbb-9acc-fc1456977f72-cert\") pod \"ingress-canary-qc2rd\" (UID: \"ff2a3a67-029a-4dbb-9acc-fc1456977f72\") " pod="openshift-ingress-canary/ingress-canary-qc2rd" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.765973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0e3b1091-0285-4063-9031-fdca0671b63c-node-bootstrap-token\") pod \"machine-config-server-jwdjw\" (UID: \"0e3b1091-0285-4063-9031-fdca0671b63c\") " pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.774303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0e3b1091-0285-4063-9031-fdca0671b63c-certs\") pod \"machine-config-server-jwdjw\" (UID: \"0e3b1091-0285-4063-9031-fdca0671b63c\") " pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.774554 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.778446 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e42f548-0f82-45ec-9d39-7131fe9f66ad-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nlpzc\" (UID: \"1e42f548-0f82-45ec-9d39-7131fe9f66ad\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.787515 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krnzm\" (UniqueName: \"kubernetes.io/projected/77532493-1d9d-47fe-ae15-2163ee04a272-kube-api-access-krnzm\") pod \"machine-config-controller-84d6567774-pssx9\" (UID: \"77532493-1d9d-47fe-ae15-2163ee04a272\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.787979 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.790807 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb"] Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.800202 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.811359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c64dk\" (UniqueName: \"kubernetes.io/projected/ce133d91-dccc-4862-ac75-8e3de6fb88cf-kube-api-access-c64dk\") pod \"service-ca-9c57cc56f-np5k5\" (UID: \"ce133d91-dccc-4862-ac75-8e3de6fb88cf\") " pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.827709 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.834769 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h66p\" (UniqueName: \"kubernetes.io/projected/ebbc7ead-1893-4ca5-bd08-3b3f0090a781-kube-api-access-2h66p\") pod \"service-ca-operator-777779d784-5btj7\" (UID: \"ebbc7ead-1893-4ca5-bd08-3b3f0090a781\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.837476 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.838977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w2p4\" (UniqueName: \"kubernetes.io/projected/8f295eb1-856c-4063-8411-3282ffac9de5-kube-api-access-7w2p4\") pod \"router-default-5444994796-xhdkx\" (UID: \"8f295eb1-856c-4063-8411-3282ffac9de5\") " pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.855479 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: E1003 08:42:36.855867 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.355851357 +0000 UTC m=+143.708785595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.864420 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/862522f5-5c44-456c-a8c3-787d7dc573df-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hwr6b\" (UID: \"862522f5-5c44-456c-a8c3-787d7dc573df\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.888326 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjp5v\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-kube-api-access-tjp5v\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.913476 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r499b\" (UniqueName: \"kubernetes.io/projected/e6ba1e49-48e4-4fa7-860e-d7fac4e26c99-kube-api-access-r499b\") pod \"migrator-59844c95c7-c8l5f\" (UID: \"e6ba1e49-48e4-4fa7-860e-d7fac4e26c99\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.927421 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4l52\" (UniqueName: \"kubernetes.io/projected/ef2e3da0-4171-404c-9521-6b69ee4b54f6-kube-api-access-f4l52\") pod 
\"multus-admission-controller-857f4d67dd-mksjr\" (UID: \"ef2e3da0-4171-404c-9521-6b69ee4b54f6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.946169 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqtn6\" (UniqueName: \"kubernetes.io/projected/6c82e80d-c50d-4475-839c-f757c4e545f7-kube-api-access-fqtn6\") pod \"machine-config-operator-74547568cd-7nt9l\" (UID: \"6c82e80d-c50d-4475-839c-f757c4e545f7\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.956901 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:36 crc kubenswrapper[4790]: E1003 08:42:36.957617 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.457593827 +0000 UTC m=+143.810528065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:36 crc kubenswrapper[4790]: I1003 08:42:36.965814 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndt26\" (UniqueName: \"kubernetes.io/projected/29ea825c-5d1f-4563-af62-10be06c59e0a-kube-api-access-ndt26\") pod \"olm-operator-6b444d44fb-87tx4\" (UID: \"29ea825c-5d1f-4563-af62-10be06c59e0a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.004992 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.009609 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q77cj\" (UniqueName: \"kubernetes.io/projected/a7785589-4fe5-4219-8fc8-f314d6e0dd7c-kube-api-access-q77cj\") pod \"csi-hostpathplugin-t9hsf\" (UID: \"a7785589-4fe5-4219-8fc8-f314d6e0dd7c\") " pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.015035 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.028776 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26qgl\" (UniqueName: \"kubernetes.io/projected/0e3b1091-0285-4063-9031-fdca0671b63c-kube-api-access-26qgl\") pod \"machine-config-server-jwdjw\" (UID: \"0e3b1091-0285-4063-9031-fdca0671b63c\") " pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.040157 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.053109 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kzt96"] Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.057686 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.058392 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.058508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87mj6\" (UniqueName: \"kubernetes.io/projected/ff2a3a67-029a-4dbb-9acc-fc1456977f72-kube-api-access-87mj6\") pod \"ingress-canary-qc2rd\" (UID: \"ff2a3a67-029a-4dbb-9acc-fc1456977f72\") " pod="openshift-ingress-canary/ingress-canary-qc2rd" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.059384 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.059775 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.559761208 +0000 UTC m=+143.912695436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.080780 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.113636 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.119685 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.140475 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jwdjw" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.147891 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qc2rd" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.167006 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.167587 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.667546376 +0000 UTC m=+144.020480614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.172016 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.198533 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf"] Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.215197 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs"] Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.249679 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9zxx"] Oct 03 08:42:37 crc kubenswrapper[4790]: W1003 08:42:37.265801 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f295eb1_856c_4063_8411_3282ffac9de5.slice/crio-4edcb03744567a59f0c790aa1d64fd0e079267f5d71884a26e9d1262f17a44d4 WatchSource:0}: Error finding container 4edcb03744567a59f0c790aa1d64fd0e079267f5d71884a26e9d1262f17a44d4: Status 404 returned error can't find the container with id 4edcb03744567a59f0c790aa1d64fd0e079267f5d71884a26e9d1262f17a44d4 Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.283093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.306419 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.806391657 +0000 UTC m=+144.159325885 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.340169 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" event={"ID":"1f090fa3-d832-4693-a9a7-24cdae8aae07","Type":"ContainerStarted","Data":"f7062530e522db721c033f2df7c2e0f4dc0fb345e7ef2776dbd8e201686ca9ed"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.388919 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" event={"ID":"560651e5-35a6-4428-b763-972bdbef6c6d","Type":"ContainerStarted","Data":"68cc695e1e71b9f50a378147cb201ea07e8ff5c366fe9abe0fc8ca3ab170a2c5"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.388980 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" 
event={"ID":"560651e5-35a6-4428-b763-972bdbef6c6d","Type":"ContainerStarted","Data":"6fb879ad0198b989a896f953bfeb96f075f059271b26a995856a22c807b9eb32"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.416901 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.417906 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.917866168 +0000 UTC m=+144.270800406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.418089 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.419941 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:37.919915295 +0000 UTC m=+144.272849723 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.483397 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" event={"ID":"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6","Type":"ContainerStarted","Data":"e9c9e5487a3a633c03a936c81247ac25121301250424be2a52bd7bd597b63de1"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.483989 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" event={"ID":"e2c063f6-68dc-4e05-bb17-fcbee1a2eec6","Type":"ContainerStarted","Data":"e7b9162d5ae546a84807c5bd993ee1ba2a66dbba460dca9ce25533a2a5494b10"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.491924 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" event={"ID":"0826b49c-d336-4af5-82fc-c0e9d03cc7f4","Type":"ContainerStarted","Data":"15b39dc27c5db88a8b50cadedf15e23b7b4289c52e9edff0e2439e505d264a6f"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.507003 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" 
event={"ID":"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd","Type":"ContainerStarted","Data":"78a7d12b531b2d53bf821d5a377f1f09a489754b2dd83cc02fc044914a276e18"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.507068 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" event={"ID":"6f9925f6-f8f0-4e9d-8f05-1d3e08d9e4dd","Type":"ContainerStarted","Data":"f6c7ac5a44bc26c57c1a9c1f531a43909bc2a7789722af18375604ce398af00f"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.510072 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" event={"ID":"74be60a5-ccdf-40b5-bc76-ebf2196ae376","Type":"ContainerStarted","Data":"9db2214616770ef4914f92507e05e694fad50a555ad92b0ae796b74294560fdd"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.515885 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.519749 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.523164 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.023135626 +0000 UTC m=+144.376069864 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.531944 4790 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-tvcgx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.532012 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" podUID="74be60a5-ccdf-40b5-bc76-ebf2196ae376" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.534312 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" event={"ID":"44aed4cf-ccda-49ee-9d8a-c1b95946bdf9","Type":"ContainerStarted","Data":"c072d96b578a0a445c5debbdf6d32c9fbd6cce7928305c4a8a3ea0d3899e3c2d"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.558483 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9"] Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.573178 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" 
event={"ID":"484bb0d4-e7ab-406b-a9e8-5972d7cbfa8d","Type":"ContainerStarted","Data":"44236f8ce417d109ee90693b2ca3eef1411c1a9d204d45530bb9a96239e1f8f0"} Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.580095 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-4gzvp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.580181 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4gzvp" podUID="cf6939a0-5f2b-4db5-b5c2-613e19bb4b43" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.590468 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.617895 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg"] Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.630661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.645876 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-03 08:42:38.145849098 +0000 UTC m=+144.498783336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.714740 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-4gzvp" podStartSLOduration=122.714717764 podStartE2EDuration="2m2.714717764s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:37.712306787 +0000 UTC m=+144.065241025" watchObservedRunningTime="2025-10-03 08:42:37.714717764 +0000 UTC m=+144.067652002" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.733372 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.734916 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.234897935 +0000 UTC m=+144.587832163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.836155 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.836555 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.336542611 +0000 UTC m=+144.689476849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.880351 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-tml4h" podStartSLOduration=122.88032566 podStartE2EDuration="2m2.88032566s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:37.877943233 +0000 UTC m=+144.230877471" watchObservedRunningTime="2025-10-03 08:42:37.88032566 +0000 UTC m=+144.233259898" Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.940013 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.940185 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.440161993 +0000 UTC m=+144.793096241 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.941621 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:37 crc kubenswrapper[4790]: E1003 08:42:37.942021 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.442005915 +0000 UTC m=+144.794940153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:37 crc kubenswrapper[4790]: I1003 08:42:37.962056 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.046010 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:38 crc kubenswrapper[4790]: E1003 08:42:38.046656 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.546637224 +0000 UTC m=+144.899571462 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.182600 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:38 crc kubenswrapper[4790]: E1003 08:42:38.183901 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.683878901 +0000 UTC m=+145.036813139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.288726 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:38 crc kubenswrapper[4790]: E1003 08:42:38.290201 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.790171167 +0000 UTC m=+145.143105405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.308980 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h7tdr"] Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.359737 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ggr2t" podStartSLOduration=123.359716102 podStartE2EDuration="2m3.359716102s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:38.354668741 +0000 UTC m=+144.707602999" watchObservedRunningTime="2025-10-03 08:42:38.359716102 +0000 UTC m=+144.712650330" Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.387278 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf"] Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.387329 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2j9mx"] Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.391843 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:38 crc kubenswrapper[4790]: E1003 08:42:38.392313 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.892297728 +0000 UTC m=+145.245231966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.494393 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:38 crc kubenswrapper[4790]: E1003 08:42:38.494883 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.99484646 +0000 UTC m=+145.347780698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.495083 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:38 crc kubenswrapper[4790]: E1003 08:42:38.495500 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:38.995486198 +0000 UTC m=+145.348420436 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.553308 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9"] Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.585765 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-msk65 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.585839 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-msk65" podUID="04e0d67a-57a0-47be-8b3a-79e45a4c17cd" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.596976 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:38 crc kubenswrapper[4790]: E1003 08:42:38.598204 4790 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:39.098183714 +0000 UTC m=+145.451117952 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.600718 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg"] Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.600832 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:38 crc kubenswrapper[4790]: E1003 08:42:38.601735 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:39.101725713 +0000 UTC m=+145.454659951 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.629307 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps"] Oct 03 08:42:38 crc kubenswrapper[4790]: W1003 08:42:38.633881 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77532493_1d9d_47fe_ae15_2163ee04a272.slice/crio-5a3f17659c0fa0da9974c4c4035ba5fe1dfd22fab178e49535815df7e25ff634 WatchSource:0}: Error finding container 5a3f17659c0fa0da9974c4c4035ba5fe1dfd22fab178e49535815df7e25ff634: Status 404 returned error can't find the container with id 5a3f17659c0fa0da9974c4c4035ba5fe1dfd22fab178e49535815df7e25ff634 Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.633946 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-np5k5"] Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.642678 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st"] Oct 03 08:42:38 crc kubenswrapper[4790]: W1003 08:42:38.651807 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod801aeaf6_dba4_4080_ba94_52aaace1c77b.slice/crio-0d6d3232dd1910e30b52aed02d6173816c2b6186e87ddefbd023bfd60bbf7c32 WatchSource:0}: Error finding container 
0d6d3232dd1910e30b52aed02d6173816c2b6186e87ddefbd023bfd60bbf7c32: Status 404 returned error can't find the container with id 0d6d3232dd1910e30b52aed02d6173816c2b6186e87ddefbd023bfd60bbf7c32 Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.654796 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xhdkx" event={"ID":"8f295eb1-856c-4063-8411-3282ffac9de5","Type":"ContainerStarted","Data":"ac555841c0b68f2f30a3b16a2233861f117eef9feade79a32d163d77bb6a89ad"} Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.654850 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xhdkx" event={"ID":"8f295eb1-856c-4063-8411-3282ffac9de5","Type":"ContainerStarted","Data":"4edcb03744567a59f0c790aa1d64fd0e079267f5d71884a26e9d1262f17a44d4"} Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.655269 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4"] Oct 03 08:42:38 crc kubenswrapper[4790]: W1003 08:42:38.686075 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68fadce0_3257_4525_b81f_552537c013be.slice/crio-244011290d41446dc3cc9418cf6bdf2d65a640e03359c49e6ec8cb241dcbca16 WatchSource:0}: Error finding container 244011290d41446dc3cc9418cf6bdf2d65a640e03359c49e6ec8cb241dcbca16: Status 404 returned error can't find the container with id 244011290d41446dc3cc9418cf6bdf2d65a640e03359c49e6ec8cb241dcbca16 Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.702547 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:38 crc 
kubenswrapper[4790]: E1003 08:42:38.704618 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:39.204589653 +0000 UTC m=+145.557523891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.718092 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-lzfvb" podStartSLOduration=123.718068698 podStartE2EDuration="2m3.718068698s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:38.717142183 +0000 UTC m=+145.070076421" watchObservedRunningTime="2025-10-03 08:42:38.718068698 +0000 UTC m=+145.071002936" Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.753485 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-msk65" podStartSLOduration=123.753423552 podStartE2EDuration="2m3.753423552s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:38.686408888 +0000 UTC m=+145.039343146" watchObservedRunningTime="2025-10-03 
08:42:38.753423552 +0000 UTC m=+145.106357790" Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.754385 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" event={"ID":"fc49b92e-3574-4789-9f65-c2aeb678899b","Type":"ContainerStarted","Data":"f2a643c09fa64b859e997ef12a9f63fcd7216088a75afb43730bc2e7f5515b21"} Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.754442 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" event={"ID":"fc49b92e-3574-4789-9f65-c2aeb678899b","Type":"ContainerStarted","Data":"0ca1e9f69b909c16c594ab83ba3a1d56c84b31e4992e387e52dced006a3f439e"} Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.757006 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l"] Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.765666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" event={"ID":"89e2e475-d36b-4626-ab23-e767070eb376","Type":"ContainerStarted","Data":"f4707bfa3993c663a3025602a4d1e12ec53178eceb8e1430179d84b6873f5cc2"} Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.770912 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" event={"ID":"a04ec950-87ce-4f04-9c9d-201a4f6099a1","Type":"ContainerStarted","Data":"0ed1e9b90bbf329b5d60466677ba8dcd121b8b3a8d8653d055c6bada0a7579e9"} Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.771659 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" event={"ID":"9b926d95-536f-4925-8078-d14df3d56611","Type":"ContainerStarted","Data":"060078ed937596f1d5245d25b543d7b4501046967f3eab7494cff4e418bd32b1"} Oct 03 
08:42:38 crc kubenswrapper[4790]: W1003 08:42:38.777240 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb758737_329d_43f2_9fa0_50537bd62787.slice/crio-d93b8343cb3edb250ae1b4d187637d569c7b3809e9dbfd80966b90c855b4228f WatchSource:0}: Error finding container d93b8343cb3edb250ae1b4d187637d569c7b3809e9dbfd80966b90c855b4228f: Status 404 returned error can't find the container with id d93b8343cb3edb250ae1b4d187637d569c7b3809e9dbfd80966b90c855b4228f Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.807899 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:38 crc kubenswrapper[4790]: E1003 08:42:38.808413 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:39.30839564 +0000 UTC m=+145.661329878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.856356 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" event={"ID":"81ab3c47-6cf1-493f-b1bd-730c970de51f","Type":"ContainerStarted","Data":"4d49ca736b8ae3545a2657c8d50a42816d3c7bcebcdd90b4ac4056fd4f3b8ce6"} Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.856432 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" event={"ID":"81ab3c47-6cf1-493f-b1bd-730c970de51f","Type":"ContainerStarted","Data":"09304d48a52388b1e3f251e2f9db5ee1d416e53015f48a3d390935da7edbbce7"} Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.858397 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.876727 4790 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8p9cs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" start-of-body= Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.876776 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" podUID="81ab3c47-6cf1-493f-b1bd-730c970de51f" containerName="packageserver" probeResult="failure" 
output="Get \"https://10.217.0.21:5443/healthz\": dial tcp 10.217.0.21:5443: connect: connection refused" Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.880382 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jk7ch" podStartSLOduration=124.880356452 podStartE2EDuration="2m4.880356452s" podCreationTimestamp="2025-10-03 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:38.876407412 +0000 UTC m=+145.229341670" watchObservedRunningTime="2025-10-03 08:42:38.880356452 +0000 UTC m=+145.233290690" Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.883692 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mksjr"] Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.891128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" event={"ID":"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b","Type":"ContainerStarted","Data":"88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4"} Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.891185 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" event={"ID":"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b","Type":"ContainerStarted","Data":"e0ebe644bff0fed5794d6b8d21fa932c48c5c1d17133a5cf89e2ee3f8f5eb6fe"} Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.892028 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.911193 4790 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r9zxx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure 
output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body= Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.911251 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" podUID="a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.911834 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:38 crc kubenswrapper[4790]: E1003 08:42:38.914370 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:39.414334666 +0000 UTC m=+145.767268894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:38 crc kubenswrapper[4790]: I1003 08:42:38.927139 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" event={"ID":"0826b49c-d336-4af5-82fc-c0e9d03cc7f4","Type":"ContainerStarted","Data":"520dcb67d74ff0dedd53c7611073e564047af54d36f7d7ccc19fe87927d45bec"} Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.000101 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b"] Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.022790 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5btj7"] Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.023461 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-8rqjf" podStartSLOduration=126.02343114 podStartE2EDuration="2m6.02343114s" podCreationTimestamp="2025-10-03 08:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:39.011287463 +0000 UTC m=+145.364221701" watchObservedRunningTime="2025-10-03 08:42:39.02343114 +0000 UTC m=+145.376365388" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.023684 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:39 crc kubenswrapper[4790]: E1003 08:42:39.024268 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:39.524251143 +0000 UTC m=+145.877185381 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.048738 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc"] Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.114444 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qc2rd"] Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.115124 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.121334 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jwdjw" event={"ID":"0e3b1091-0285-4063-9031-fdca0671b63c","Type":"ContainerStarted","Data":"a83b4a491e9874d81796dc0aa6883946d910e35b493a20899af2766509ecbbf5"} Oct 03 
08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.121387 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jwdjw" event={"ID":"0e3b1091-0285-4063-9031-fdca0671b63c","Type":"ContainerStarted","Data":"6627531872e09b930798b7c8a97b035bcedf73c5f2279e7ca8e1d3b3f5d895a7"} Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.129275 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:39 crc kubenswrapper[4790]: E1003 08:42:39.129799 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:39.629777118 +0000 UTC m=+145.982711366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.134869 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:42:39 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Oct 03 08:42:39 crc kubenswrapper[4790]: [+]process-running ok Oct 03 08:42:39 crc kubenswrapper[4790]: healthz check failed Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.134938 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.135849 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f"] Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.150671 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-t9hsf"] Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.151760 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" event={"ID":"6fd95bed-c64a-41fa-9382-233d6b69b31b","Type":"ContainerStarted","Data":"0584a190270ff508460569cfa791e21e226bd1fe0cbeef5f44d6ddefecc16a2a"} Oct 03 
08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.151826 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.151959 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xnn4z" podStartSLOduration=124.151933634 podStartE2EDuration="2m4.151933634s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:39.098655203 +0000 UTC m=+145.451589451" watchObservedRunningTime="2025-10-03 08:42:39.151933634 +0000 UTC m=+145.504867872" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.162942 4790 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-8zvcg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.162998 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" podUID="6fd95bed-c64a-41fa-9382-233d6b69b31b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.194995 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2j9mx" event={"ID":"3db3e714-d4c8-4e90-9d7a-d694fadfa856","Type":"ContainerStarted","Data":"a6b0a9c3f32ea09e8f06d3b766cae678ae666edfd4ef54100f5610fb64fcba87"} Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.212841 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.223066 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-msk65" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.232636 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.233708 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:39 crc kubenswrapper[4790]: E1003 08:42:39.248119 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:39.748095958 +0000 UTC m=+146.101030396 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.325184 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" podStartSLOduration=124.325153442 podStartE2EDuration="2m4.325153442s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:39.314515596 +0000 UTC m=+145.667449834" watchObservedRunningTime="2025-10-03 08:42:39.325153442 +0000 UTC m=+145.678087680" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.348222 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:39 crc kubenswrapper[4790]: E1003 08:42:39.348826 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:39.848798589 +0000 UTC m=+146.201732827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:39 crc kubenswrapper[4790]: E1003 08:42:39.450863 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:39.950849028 +0000 UTC m=+146.303783256 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.450457 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.533059 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" podStartSLOduration=125.533035313 podStartE2EDuration="2m5.533035313s" 
podCreationTimestamp="2025-10-03 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:39.504065238 +0000 UTC m=+145.856999486" watchObservedRunningTime="2025-10-03 08:42:39.533035313 +0000 UTC m=+145.885969551" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.533731 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vm8dg" podStartSLOduration=124.533725322 podStartE2EDuration="2m4.533725322s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:39.531001956 +0000 UTC m=+145.883936204" watchObservedRunningTime="2025-10-03 08:42:39.533725322 +0000 UTC m=+145.886659560" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.552306 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:39 crc kubenswrapper[4790]: E1003 08:42:39.552738 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:40.052715021 +0000 UTC m=+146.405649259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.661884 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:39 crc kubenswrapper[4790]: E1003 08:42:39.662262 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:40.162249407 +0000 UTC m=+146.515183645 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.706494 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-f9drg" podStartSLOduration=125.706470937 podStartE2EDuration="2m5.706470937s" podCreationTimestamp="2025-10-03 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:39.705719846 +0000 UTC m=+146.058654084" watchObservedRunningTime="2025-10-03 08:42:39.706470937 +0000 UTC m=+146.059405175" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.769168 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:39 crc kubenswrapper[4790]: E1003 08:42:39.769607 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:40.269590852 +0000 UTC m=+146.622525090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.871321 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:39 crc kubenswrapper[4790]: E1003 08:42:39.871970 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:40.371951599 +0000 UTC m=+146.724885837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.922320 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" podStartSLOduration=124.922295769 podStartE2EDuration="2m4.922295769s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:39.897143079 +0000 UTC m=+146.250077317" watchObservedRunningTime="2025-10-03 08:42:39.922295769 +0000 UTC m=+146.275230007" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.925192 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-jwdjw" podStartSLOduration=6.925184769 podStartE2EDuration="6.925184769s" podCreationTimestamp="2025-10-03 08:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:39.92195966 +0000 UTC m=+146.274893908" watchObservedRunningTime="2025-10-03 08:42:39.925184769 +0000 UTC m=+146.278119007" Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.972820 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:39 crc kubenswrapper[4790]: E1003 08:42:39.973432 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:40.47341083 +0000 UTC m=+146.826345068 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:39 crc kubenswrapper[4790]: I1003 08:42:39.978405 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qgfpf" podStartSLOduration=124.978347158 podStartE2EDuration="2m4.978347158s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:39.968031861 +0000 UTC m=+146.320966099" watchObservedRunningTime="2025-10-03 08:42:39.978347158 +0000 UTC m=+146.331281406" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.027465 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xhdkx" podStartSLOduration=125.027437763 podStartE2EDuration="2m5.027437763s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 08:42:40.027292349 +0000 UTC m=+146.380226597" watchObservedRunningTime="2025-10-03 08:42:40.027437763 +0000 UTC m=+146.380372001" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.038953 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.039485 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.076495 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:40 crc kubenswrapper[4790]: E1003 08:42:40.078512 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:40.578481163 +0000 UTC m=+146.931415401 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.118283 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.118495 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.128884 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" podStartSLOduration=125.128854974 podStartE2EDuration="2m5.128854974s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:40.116705125 +0000 UTC m=+146.469639373" watchObservedRunningTime="2025-10-03 08:42:40.128854974 +0000 UTC m=+146.481789222" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.141850 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:42:40 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Oct 03 08:42:40 crc kubenswrapper[4790]: [+]process-running ok Oct 03 08:42:40 crc kubenswrapper[4790]: healthz check failed Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 
08:42:40.141915 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.170940 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.191675 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.191842 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.191886 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:40 crc kubenswrapper[4790]: E1003 08:42:40.191966 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 08:42:40.691946318 +0000 UTC m=+147.044880556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.192678 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:40 crc kubenswrapper[4790]: E1003 08:42:40.193303 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:40.693283216 +0000 UTC m=+147.046217444 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.193297 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-62t8j" podStartSLOduration=125.193284366 podStartE2EDuration="2m5.193284366s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:40.191213568 +0000 UTC m=+146.544147806" watchObservedRunningTime="2025-10-03 08:42:40.193284366 +0000 UTC m=+146.546218604" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.249639 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" podStartSLOduration=126.249620843 podStartE2EDuration="2m6.249620843s" podCreationTimestamp="2025-10-03 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:40.249001706 +0000 UTC m=+146.601935944" watchObservedRunningTime="2025-10-03 08:42:40.249620843 +0000 UTC m=+146.602555071" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.258912 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" event={"ID":"68fadce0-3257-4525-b81f-552537c013be","Type":"ContainerStarted","Data":"464aec7f7c6fd3f9380c7c491dfee2cab43bacc0d8f586ee7aeb6b34f58b320e"} 
Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.258961 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" event={"ID":"68fadce0-3257-4525-b81f-552537c013be","Type":"ContainerStarted","Data":"244011290d41446dc3cc9418cf6bdf2d65a640e03359c49e6ec8cb241dcbca16"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.270978 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" event={"ID":"a7785589-4fe5-4219-8fc8-f314d6e0dd7c","Type":"ContainerStarted","Data":"6be4f19de6ed0f78ddc41de91d06cc7fd72813217cb2f4141f6350b1c308c9c3"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.272644 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f" event={"ID":"e6ba1e49-48e4-4fa7-860e-d7fac4e26c99","Type":"ContainerStarted","Data":"5f6c40bd471f1d04d30ef3b9b00d938c268f83e76044425b892288bc16d304ee"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.291407 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" event={"ID":"cb758737-329d-43f2-9fa0-50537bd62787","Type":"ContainerStarted","Data":"ab7e62b0b7c5c0750c56aa863552cadc0db9bdaa9d4ffc9b9b768dcb3cad33a5"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.291459 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" event={"ID":"cb758737-329d-43f2-9fa0-50537bd62787","Type":"ContainerStarted","Data":"d93b8343cb3edb250ae1b4d187637d569c7b3809e9dbfd80966b90c855b4228f"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.293972 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" 
event={"ID":"801aeaf6-dba4-4080-ba94-52aaace1c77b","Type":"ContainerStarted","Data":"65c173ed79ba7d6d2142e7b4d58ae01412b591139835733a1aebab7f5176c774"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.294056 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" event={"ID":"801aeaf6-dba4-4080-ba94-52aaace1c77b","Type":"ContainerStarted","Data":"0d6d3232dd1910e30b52aed02d6173816c2b6186e87ddefbd023bfd60bbf7c32"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.300942 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cxxjr" podStartSLOduration=126.300920159 podStartE2EDuration="2m6.300920159s" podCreationTimestamp="2025-10-03 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:40.297039732 +0000 UTC m=+146.649973970" watchObservedRunningTime="2025-10-03 08:42:40.300920159 +0000 UTC m=+146.653854407" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.305356 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:40 crc kubenswrapper[4790]: E1003 08:42:40.307177 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:40.806637678 +0000 UTC m=+147.159571916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.357059 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" event={"ID":"0826b49c-d336-4af5-82fc-c0e9d03cc7f4","Type":"ContainerStarted","Data":"e51f4ab2a44737658517ee152cc9185b3c4b2763920f89c0d75c0303234a06eb"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.361150 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" event={"ID":"a04ec950-87ce-4f04-9c9d-201a4f6099a1","Type":"ContainerStarted","Data":"0a70a9044adf281ff7b64c63a15d94aefe74174c8b191ba0fc8a2cbb6c6e67a8"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.362426 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.363558 4790 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-h7tdr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.363666 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" podUID="a04ec950-87ce-4f04-9c9d-201a4f6099a1" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.397172 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" event={"ID":"1e42f548-0f82-45ec-9d39-7131fe9f66ad","Type":"ContainerStarted","Data":"1527d73243de02ad778431eeaf30337a015114b9e44b7eeeb4c342673ce3bbf7"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.408494 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:40 crc kubenswrapper[4790]: E1003 08:42:40.409982 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:40.909961232 +0000 UTC m=+147.262895680 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.427111 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" podStartSLOduration=125.427086518 podStartE2EDuration="2m5.427086518s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:40.422178811 +0000 UTC m=+146.775113049" watchObservedRunningTime="2025-10-03 08:42:40.427086518 +0000 UTC m=+146.780020756" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.445403 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" event={"ID":"6fd95bed-c64a-41fa-9382-233d6b69b31b","Type":"ContainerStarted","Data":"cf3a36ce8820f9c42d94a106c6e014d3c850d3e90b825cb05c4302d21ff33184"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.446678 4790 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-8zvcg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.446716 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" 
podUID="6fd95bed-c64a-41fa-9382-233d6b69b31b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.451237 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2j9mx" event={"ID":"3db3e714-d4c8-4e90-9d7a-d694fadfa856","Type":"ContainerStarted","Data":"bb7a2fc9f64874b36a0b781c54815878f74905eeab62089e57445777e71df6ac"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.461998 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" podStartSLOduration=125.461954718 podStartE2EDuration="2m5.461954718s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:40.450966353 +0000 UTC m=+146.803900591" watchObservedRunningTime="2025-10-03 08:42:40.461954718 +0000 UTC m=+146.814888966" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.467367 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qc2rd" event={"ID":"ff2a3a67-029a-4dbb-9acc-fc1456977f72","Type":"ContainerStarted","Data":"f167e1f9f5a3bc5912b2b8b7e60ed5d77fa230eb73aced823d6b139b37559ba3"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.467418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qc2rd" event={"ID":"ff2a3a67-029a-4dbb-9acc-fc1456977f72","Type":"ContainerStarted","Data":"a00f5145218749f655f279b22650aa1261f0e374c03e58eb21138eb2bd90bd4e"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.502220 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" 
event={"ID":"ce133d91-dccc-4862-ac75-8e3de6fb88cf","Type":"ContainerStarted","Data":"aa60f0ebbe0caca72a2fb7368b84de4b9583acd761e0d1b91f6178553e96ab2c"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.502822 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" event={"ID":"ce133d91-dccc-4862-ac75-8e3de6fb88cf","Type":"ContainerStarted","Data":"33f3b85f32a79b0a6ba0b30173935028809144d523f152e068adba3aeb8f2ac9"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.510016 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:40 crc kubenswrapper[4790]: E1003 08:42:40.511201 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:41.011186417 +0000 UTC m=+147.364120655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.550932 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" event={"ID":"89e2e475-d36b-4626-ab23-e767070eb376","Type":"ContainerStarted","Data":"3686fde350089c5444ae1d4420f76005966ad85c11d6d1d6291015faa14130c2"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.556240 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" event={"ID":"77532493-1d9d-47fe-ae15-2163ee04a272","Type":"ContainerStarted","Data":"4baad3f38b0ffd69692b6ea01bff9386850fa5e6327cedd27e5056c89f4f46bb"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.556307 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" event={"ID":"77532493-1d9d-47fe-ae15-2163ee04a272","Type":"ContainerStarted","Data":"5a3f17659c0fa0da9974c4c4035ba5fe1dfd22fab178e49535815df7e25ff634"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.560505 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" podStartSLOduration=126.560488468 podStartE2EDuration="2m6.560488468s" podCreationTimestamp="2025-10-03 08:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:40.526975376 
+0000 UTC m=+146.879909614" watchObservedRunningTime="2025-10-03 08:42:40.560488468 +0000 UTC m=+146.913422706" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.560871 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" event={"ID":"9b926d95-536f-4925-8078-d14df3d56611","Type":"ContainerStarted","Data":"913b4ee9ee72bfaebea93028acbf31a040d0e5916a7c4b5e4bc3f0c8be836570"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.564135 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" event={"ID":"ef2e3da0-4171-404c-9521-6b69ee4b54f6","Type":"ContainerStarted","Data":"cae35b4cd263db7e625acbb72eabeb2d9e254083db8a0e7137d2b17f44c10323"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.583800 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" event={"ID":"6c82e80d-c50d-4475-839c-f757c4e545f7","Type":"ContainerStarted","Data":"210540c8da299df267dd697ed7ab9f4962ae08627edbba0e85782c07b88c07a3"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.583852 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" event={"ID":"6c82e80d-c50d-4475-839c-f757c4e545f7","Type":"ContainerStarted","Data":"bfe3ba26589357c826d1fe883d2c63332f76102b353c7c5be5e8275d9bd361d3"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.607610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" event={"ID":"862522f5-5c44-456c-a8c3-787d7dc573df","Type":"ContainerStarted","Data":"c1e9f79cc8923ef23d31e2d0ccf65c10d5463bc24dbaf632ec80b7950299732d"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.620293 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sv5st" podStartSLOduration=125.620271491 podStartE2EDuration="2m5.620271491s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:40.613958305 +0000 UTC m=+146.966892543" watchObservedRunningTime="2025-10-03 08:42:40.620271491 +0000 UTC m=+146.973205729" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.637272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" event={"ID":"29ea825c-5d1f-4563-af62-10be06c59e0a","Type":"ContainerStarted","Data":"667860523c7e53b53d1873a8e9a069ebe729ba0d205b1065fa1b07d59b113dfa"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.637531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" event={"ID":"29ea825c-5d1f-4563-af62-10be06c59e0a","Type":"ContainerStarted","Data":"4db4c0ad0313a7bf8bd196321a8a2ee6d89ea5f630f953818d99d0d1c8c7b292"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.639175 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.647748 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:40 crc kubenswrapper[4790]: E1003 08:42:40.648221 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:41.148203288 +0000 UTC m=+147.501137526 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.658175 4790 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-87tx4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.658288 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" podUID="29ea825c-5d1f-4563-af62-10be06c59e0a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.661205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" event={"ID":"ebbc7ead-1893-4ca5-bd08-3b3f0090a781","Type":"ContainerStarted","Data":"965c296856eb8cbee1d3502d5658e3af4af74a86718a8ec97cff2a39228878c5"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.661262 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" 
event={"ID":"ebbc7ead-1893-4ca5-bd08-3b3f0090a781","Type":"ContainerStarted","Data":"6994580aa7096f2d5fbb13870b511d60572d739c895d1e18ebe8c50a93d14130"} Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.690101 4790 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r9zxx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" start-of-body= Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.690154 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" podUID="a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.17:6443/healthz\": dial tcp 10.217.0.17:6443: connect: connection refused" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.701894 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jjxln" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.728979 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qc2rd" podStartSLOduration=7.728937693 podStartE2EDuration="7.728937693s" podCreationTimestamp="2025-10-03 08:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:40.7194596 +0000 UTC m=+147.072393838" watchObservedRunningTime="2025-10-03 08:42:40.728937693 +0000 UTC m=+147.081871951" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.750311 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:40 crc kubenswrapper[4790]: E1003 08:42:40.753842 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:41.253815075 +0000 UTC m=+147.606749553 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.833374 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-kzt96" podStartSLOduration=125.833350187 podStartE2EDuration="2m5.833350187s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:40.829474619 +0000 UTC m=+147.182408857" watchObservedRunningTime="2025-10-03 08:42:40.833350187 +0000 UTC m=+147.186284415" Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.854634 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:40 crc 
kubenswrapper[4790]: E1003 08:42:40.855155 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:41.355140143 +0000 UTC m=+147.708074381 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:40 crc kubenswrapper[4790]: I1003 08:42:40.957693 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:40 crc kubenswrapper[4790]: E1003 08:42:40.958111 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:41.458090866 +0000 UTC m=+147.811025104 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.059646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.060145 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:41.560130414 +0000 UTC m=+147.913064652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.117120 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" podStartSLOduration=126.117101818 podStartE2EDuration="2m6.117101818s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.010482443 +0000 UTC m=+147.363416681" watchObservedRunningTime="2025-10-03 08:42:41.117101818 +0000 UTC m=+147.470036056" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.137122 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:42:41 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Oct 03 08:42:41 crc kubenswrapper[4790]: [+]process-running ok Oct 03 08:42:41 crc kubenswrapper[4790]: healthz check failed Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.137197 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.166303 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.166705 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:41.666657926 +0000 UTC m=+148.019592164 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.247506 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dp7r9" podStartSLOduration=126.247481784 podStartE2EDuration="2m6.247481784s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.236471938 +0000 UTC m=+147.589406176" watchObservedRunningTime="2025-10-03 08:42:41.247481784 +0000 UTC m=+147.600416022" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.268131 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.268676 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:41.768662714 +0000 UTC m=+148.121596952 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.309766 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5btj7" podStartSLOduration=126.309745496 podStartE2EDuration="2m6.309745496s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.309012606 +0000 UTC m=+147.661946844" watchObservedRunningTime="2025-10-03 08:42:41.309745496 +0000 UTC m=+147.662679734" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.369014 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.369463 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:41.869442977 +0000 UTC m=+148.222377215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.446215 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-np5k5" podStartSLOduration=126.446192771 podStartE2EDuration="2m6.446192771s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.379566868 +0000 UTC m=+147.732501106" watchObservedRunningTime="2025-10-03 08:42:41.446192771 +0000 UTC m=+147.799127009" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.471413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:41 crc 
kubenswrapper[4790]: E1003 08:42:41.472237 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:41.972216064 +0000 UTC m=+148.325150302 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.474776 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-wlp9g container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]log ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]etcd ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]poststarthook/max-in-flight-filter ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 03 08:42:41 crc kubenswrapper[4790]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 03 08:42:41 crc kubenswrapper[4790]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 03 08:42:41 crc kubenswrapper[4790]: 
[+]poststarthook/project.openshift.io-projectcache ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startinformers ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 03 08:42:41 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 03 08:42:41 crc kubenswrapper[4790]: livez check failed Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.474855 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" podUID="e2c063f6-68dc-4e05-bb17-fcbee1a2eec6" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.476244 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" podStartSLOduration=126.476229536 podStartE2EDuration="2m6.476229536s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.475368152 +0000 UTC m=+147.828302400" watchObservedRunningTime="2025-10-03 08:42:41.476229536 +0000 UTC m=+147.829163764" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.572698 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.573085 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.073064049 +0000 UTC m=+148.425998287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.662130 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8p9cs" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.666681 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" event={"ID":"6c82e80d-c50d-4475-839c-f757c4e545f7","Type":"ContainerStarted","Data":"f7c1646d09dfe88f24a3205621756d7cff7451803ed5093eb03c8232d958f0b2"} Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.668473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" event={"ID":"862522f5-5c44-456c-a8c3-787d7dc573df","Type":"ContainerStarted","Data":"94893dcf53eb5a6294fc82d93993171755b7e386e20329bc1ce3250f5302cb16"} Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.671056 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" event={"ID":"77532493-1d9d-47fe-ae15-2163ee04a272","Type":"ContainerStarted","Data":"3f79b2926ded8e4bf3bae9ca616834ba9016a3a995cb363109b3d9541eef5eaa"} 
Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.673697 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.674103 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.174088858 +0000 UTC m=+148.527023096 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.674912 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" event={"ID":"ef2e3da0-4171-404c-9521-6b69ee4b54f6","Type":"ContainerStarted","Data":"420a805dd0f3a1cf88ae321c5fcecbaf16b3b622c28d3e32050fcf30df85bcdb"} Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.674963 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" event={"ID":"ef2e3da0-4171-404c-9521-6b69ee4b54f6","Type":"ContainerStarted","Data":"53f68412fd6e821058abb235e163f2da10e1954a4933f70c23b6d9b44af55703"} Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 
08:42:41.677263 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" event={"ID":"801aeaf6-dba4-4080-ba94-52aaace1c77b","Type":"ContainerStarted","Data":"b0de624147d986d2a36c24bfd8e1f77cea2da10d1c58f31700cf4321a505b4d6"} Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.677778 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.679464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" event={"ID":"9b926d95-536f-4925-8078-d14df3d56611","Type":"ContainerStarted","Data":"0cf82125e07d82cabb6e283d4c0ec843334006dc373657bb14743b260a5225d7"} Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.681637 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2j9mx" event={"ID":"3db3e714-d4c8-4e90-9d7a-d694fadfa856","Type":"ContainerStarted","Data":"60fb177736fd3d8b81ef19bca5181bd265d55c9bbcc53de33653a7fcfb04914d"} Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.682072 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.685264 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" event={"ID":"a7785589-4fe5-4219-8fc8-f314d6e0dd7c","Type":"ContainerStarted","Data":"dd99004555c57fd933d52ff287758f61e3f9e9c0d82074de7a07db6ca3edbadf"} Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.687887 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f" event={"ID":"e6ba1e49-48e4-4fa7-860e-d7fac4e26c99","Type":"ContainerStarted","Data":"8b9056740f067463d532247455c3f3352b65abb4dac73dcd259755d5159234a2"} 
Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.687910 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f" event={"ID":"e6ba1e49-48e4-4fa7-860e-d7fac4e26c99","Type":"ContainerStarted","Data":"774f8341e378b27d71e34e664d56664a612b60877a528a3bceded4ad4d34bfe2"} Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.740514 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" event={"ID":"1e42f548-0f82-45ec-9d39-7131fe9f66ad","Type":"ContainerStarted","Data":"d5a0df345ccf150613d19ffc58fccbe1b28eef9c53283deb1f9eab809ab2dd53"} Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.742666 4790 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-h7tdr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.742732 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" podUID="a04ec950-87ce-4f04-9c9d-201a4f6099a1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.743458 4790 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-87tx4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.743524 4790 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" podUID="29ea825c-5d1f-4563-af62-10be06c59e0a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.763403 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8zvcg" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.775439 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.775759 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.275708835 +0000 UTC m=+148.628643073 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.779566 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.781078 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.281058403 +0000 UTC m=+148.633992851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.804312 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7nt9l" podStartSLOduration=126.80428977 podStartE2EDuration="2m6.80428977s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.798048447 +0000 UTC m=+148.150982685" watchObservedRunningTime="2025-10-03 08:42:41.80428977 +0000 UTC m=+148.157224008" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.826112 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hwr6b" podStartSLOduration=126.826086006 podStartE2EDuration="2m6.826086006s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.825032987 +0000 UTC m=+148.177967235" watchObservedRunningTime="2025-10-03 08:42:41.826086006 +0000 UTC m=+148.179020274" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.882267 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.882740 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.382693771 +0000 UTC m=+148.735628009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.886952 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.891329 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.39130262 +0000 UTC m=+148.744236858 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.913232 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2j9mx" podStartSLOduration=8.913206388999999 podStartE2EDuration="8.913206389s" podCreationTimestamp="2025-10-03 08:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.908505518 +0000 UTC m=+148.261439776" watchObservedRunningTime="2025-10-03 08:42:41.913206389 +0000 UTC m=+148.266140627" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.913856 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c8l5f" podStartSLOduration=126.913852137 podStartE2EDuration="2m6.913852137s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.865804341 +0000 UTC m=+148.218738599" watchObservedRunningTime="2025-10-03 08:42:41.913852137 +0000 UTC m=+148.266786375" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.939545 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nlpzc" podStartSLOduration=126.93949974 podStartE2EDuration="2m6.93949974s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.93804612 +0000 UTC m=+148.290980358" watchObservedRunningTime="2025-10-03 08:42:41.93949974 +0000 UTC m=+148.292433968" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.967612 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6bmf" podStartSLOduration=126.967591331 podStartE2EDuration="2m6.967591331s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.964041573 +0000 UTC m=+148.316975821" watchObservedRunningTime="2025-10-03 08:42:41.967591331 +0000 UTC m=+148.320525569" Oct 03 08:42:41 crc kubenswrapper[4790]: I1003 08:42:41.988728 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:41 crc kubenswrapper[4790]: E1003 08:42:41.989239 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.489220363 +0000 UTC m=+148.842154601 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.013897 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-pssx9" podStartSLOduration=127.013878119 podStartE2EDuration="2m7.013878119s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:42.013014784 +0000 UTC m=+148.365949022" watchObservedRunningTime="2025-10-03 08:42:42.013878119 +0000 UTC m=+148.366812347" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.015635 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mksjr" podStartSLOduration=127.015625697 podStartE2EDuration="2m7.015625697s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:41.990896929 +0000 UTC m=+148.343831177" watchObservedRunningTime="2025-10-03 08:42:42.015625697 +0000 UTC m=+148.368559945" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.019487 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.089614 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" podStartSLOduration=127.089594785 podStartE2EDuration="2m7.089594785s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:42.046009502 +0000 UTC m=+148.398943760" watchObservedRunningTime="2025-10-03 08:42:42.089594785 +0000 UTC m=+148.442529023" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.091420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:42 crc kubenswrapper[4790]: E1003 08:42:42.091871 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.591857327 +0000 UTC m=+148.944791565 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.122808 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:42:42 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Oct 03 08:42:42 crc kubenswrapper[4790]: [+]process-running ok Oct 03 08:42:42 crc kubenswrapper[4790]: healthz check failed Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.122876 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.193566 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:42 crc kubenswrapper[4790]: E1003 08:42:42.194073 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-03 08:42:42.69405181 +0000 UTC m=+149.046986048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.295003 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.295050 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.295119 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.295142 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.295162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:42 crc kubenswrapper[4790]: E1003 08:42:42.295626 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.795605214 +0000 UTC m=+149.148539442 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.296323 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.302704 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.321168 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.327296 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.395992 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:42 crc kubenswrapper[4790]: E1003 08:42:42.396274 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.896252313 +0000 UTC m=+149.249186551 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.498058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:42 crc kubenswrapper[4790]: E1003 08:42:42.498642 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:42.99861691 +0000 UTC m=+149.351551338 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.541901 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.551154 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.556242 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.599638 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:42 crc kubenswrapper[4790]: E1003 08:42:42.599919 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.099881536 +0000 UTC m=+149.452815834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.702385 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:42 crc kubenswrapper[4790]: E1003 08:42:42.705246 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.205226086 +0000 UTC m=+149.558160324 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.807220 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:42 crc kubenswrapper[4790]: E1003 08:42:42.807596 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.307556542 +0000 UTC m=+149.660490780 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.869434 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" Oct 03 08:42:42 crc kubenswrapper[4790]: I1003 08:42:42.919801 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:42 crc kubenswrapper[4790]: E1003 08:42:42.922980 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.422961881 +0000 UTC m=+149.775896119 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.027124 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zj7jl"] Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.027616 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.027708 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.527688633 +0000 UTC m=+149.880622871 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.043938 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zj7jl"] Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.044049 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.045060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.045479 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.545464329 +0000 UTC m=+149.898398567 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.097855 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.130724 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:42:43 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Oct 03 08:42:43 crc kubenswrapper[4790]: [+]process-running ok Oct 03 08:42:43 crc kubenswrapper[4790]: healthz check failed Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.130808 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.137029 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g9chf"] Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.141061 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.147944 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.148355 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.150721 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.650688194 +0000 UTC m=+150.003622432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.151567 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-catalog-content\") pod \"community-operators-zj7jl\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.151641 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.151669 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-utilities\") pod \"community-operators-zj7jl\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.151757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdtwz\" (UniqueName: 
\"kubernetes.io/projected/b66141f0-fc3d-4596-9484-d01137e1f76c-kube-api-access-bdtwz\") pod \"community-operators-zj7jl\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.152548 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.652531216 +0000 UTC m=+150.005465454 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.183786 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g9chf"] Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.264010 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.264170 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-utilities\") pod \"community-operators-zj7jl\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " pod="openshift-marketplace/community-operators-zj7jl" 
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.264209 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-catalog-content\") pod \"certified-operators-g9chf\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.264236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6w6k\" (UniqueName: \"kubernetes.io/projected/6b044862-0484-4701-a965-796c87a3adc1-kube-api-access-t6w6k\") pod \"certified-operators-g9chf\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.264279 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-utilities\") pod \"certified-operators-g9chf\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.264298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdtwz\" (UniqueName: \"kubernetes.io/projected/b66141f0-fc3d-4596-9484-d01137e1f76c-kube-api-access-bdtwz\") pod \"community-operators-zj7jl\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.264336 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-catalog-content\") pod \"community-operators-zj7jl\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " 
pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.264888 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-catalog-content\") pod \"community-operators-zj7jl\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.264966 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.764948392 +0000 UTC m=+150.117882630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.265192 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-utilities\") pod \"community-operators-zj7jl\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.308769 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rf5kb"] Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.309810 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rf5kb" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.323338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdtwz\" (UniqueName: \"kubernetes.io/projected/b66141f0-fc3d-4596-9484-d01137e1f76c-kube-api-access-bdtwz\") pod \"community-operators-zj7jl\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.341034 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rf5kb"] Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.365542 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-utilities\") pod \"certified-operators-g9chf\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.365647 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.365674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8qfx\" (UniqueName: \"kubernetes.io/projected/46b04491-809e-431f-bac5-2a84accab94f-kube-api-access-b8qfx\") pod \"community-operators-rf5kb\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " pod="openshift-marketplace/community-operators-rf5kb" Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.365701 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-catalog-content\") pod \"community-operators-rf5kb\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " pod="openshift-marketplace/community-operators-rf5kb"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.365717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-catalog-content\") pod \"certified-operators-g9chf\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " pod="openshift-marketplace/certified-operators-g9chf"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.365740 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6w6k\" (UniqueName: \"kubernetes.io/projected/6b044862-0484-4701-a965-796c87a3adc1-kube-api-access-t6w6k\") pod \"certified-operators-g9chf\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " pod="openshift-marketplace/certified-operators-g9chf"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.365763 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-utilities\") pod \"community-operators-rf5kb\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " pod="openshift-marketplace/community-operators-rf5kb"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.366200 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-utilities\") pod \"certified-operators-g9chf\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " pod="openshift-marketplace/certified-operators-g9chf"
Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.366495 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.866481766 +0000 UTC m=+150.219416004 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.366863 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-catalog-content\") pod \"certified-operators-g9chf\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " pod="openshift-marketplace/certified-operators-g9chf"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.412852 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6w6k\" (UniqueName: \"kubernetes.io/projected/6b044862-0484-4701-a965-796c87a3adc1-kube-api-access-t6w6k\") pod \"certified-operators-g9chf\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " pod="openshift-marketplace/certified-operators-g9chf"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.469359 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.469600 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zj7jl"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.469701 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8qfx\" (UniqueName: \"kubernetes.io/projected/46b04491-809e-431f-bac5-2a84accab94f-kube-api-access-b8qfx\") pod \"community-operators-rf5kb\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " pod="openshift-marketplace/community-operators-rf5kb"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.469754 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-catalog-content\") pod \"community-operators-rf5kb\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " pod="openshift-marketplace/community-operators-rf5kb"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.469798 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-utilities\") pod \"community-operators-rf5kb\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " pod="openshift-marketplace/community-operators-rf5kb"
Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.470090 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:43.970064717 +0000 UTC m=+150.322998955 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.470434 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-utilities\") pod \"community-operators-rf5kb\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " pod="openshift-marketplace/community-operators-rf5kb"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.473209 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-catalog-content\") pod \"community-operators-rf5kb\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " pod="openshift-marketplace/community-operators-rf5kb"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.492037 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8qfx\" (UniqueName: \"kubernetes.io/projected/46b04491-809e-431f-bac5-2a84accab94f-kube-api-access-b8qfx\") pod \"community-operators-rf5kb\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " pod="openshift-marketplace/community-operators-rf5kb"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.493594 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g9chf"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.535423 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7ww29"]
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.536488 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.542357 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7ww29"]
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.580127 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-utilities\") pod \"certified-operators-7ww29\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.580202 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-catalog-content\") pod \"certified-operators-7ww29\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.580271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5frg\" (UniqueName: \"kubernetes.io/projected/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-kube-api-access-r5frg\") pod \"certified-operators-7ww29\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.580327 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj"
Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.580682 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:44.080665973 +0000 UTC m=+150.433600211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:43 crc kubenswrapper[4790]: W1003 08:42:43.607161 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-03772d0d642510e7fb95c3e088f9cec4fab176f95bb74c06a787cd12ba73691b WatchSource:0}: Error finding container 03772d0d642510e7fb95c3e088f9cec4fab176f95bb74c06a787cd12ba73691b: Status 404 returned error can't find the container with id 03772d0d642510e7fb95c3e088f9cec4fab176f95bb74c06a787cd12ba73691b
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.686332 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.686731 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-utilities\") pod \"certified-operators-7ww29\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.686773 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-catalog-content\") pod \"certified-operators-7ww29\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.686835 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5frg\" (UniqueName: \"kubernetes.io/projected/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-kube-api-access-r5frg\") pod \"certified-operators-7ww29\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.687858 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:44.187836343 +0000 UTC m=+150.540770591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.688297 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-utilities\") pod \"certified-operators-7ww29\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.688410 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-catalog-content\") pod \"certified-operators-7ww29\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.703481 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rf5kb"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.720941 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5frg\" (UniqueName: \"kubernetes.io/projected/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-kube-api-access-r5frg\") pod \"certified-operators-7ww29\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.770346 4790 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.808647 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj"
Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.809295 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:44.309278991 +0000 UTC m=+150.662213229 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.895668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a6ed572f999699568aa1fa54c553b8c303abdbefb52ebb0f93164cc25c3d5be8"}
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.895729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"df897820d99b69f67f5c2f6faf731417750bec7ed83b73aae71ea09df489cf9b"}
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.900482 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7ww29"
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.911827 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 08:42:43 crc kubenswrapper[4790]: E1003 08:42:43.912304 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:44.412286315 +0000 UTC m=+150.765220563 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.919868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" event={"ID":"a7785589-4fe5-4219-8fc8-f314d6e0dd7c","Type":"ContainerStarted","Data":"dcd2edb88ba754b74c569b9b736f74857a79a62fe063f2b0b725ca36e636f06e"}
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.919917 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" event={"ID":"a7785589-4fe5-4219-8fc8-f314d6e0dd7c","Type":"ContainerStarted","Data":"fbc9ba8ec1be83946efeb69a20510fcfab277cb0db34a169826ce4539b03ef5f"}
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.925327 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"36f9cf52d9dc87ccdacee1501ae6289f8849e4e28424c5c4605b2a47b2f8c391"}
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.932954 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"03772d0d642510e7fb95c3e088f9cec4fab176f95bb74c06a787cd12ba73691b"}
Oct 03 08:42:43 crc kubenswrapper[4790]: I1003 08:42:43.951606 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zj7jl"]
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.013249 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj"
Oct 03 08:42:44 crc kubenswrapper[4790]: E1003 08:42:44.015968 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:44.515947488 +0000 UTC m=+150.868881936 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.018327 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g9chf"]
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.114490 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 08:42:44 crc kubenswrapper[4790]: E1003 08:42:44.114898 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:44.614881161 +0000 UTC m=+150.967815399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.133276 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 03 08:42:44 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld
Oct 03 08:42:44 crc kubenswrapper[4790]: [+]process-running ok
Oct 03 08:42:44 crc kubenswrapper[4790]: healthz check failed
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.133353 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.218089 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj"
Oct 03 08:42:44 crc kubenswrapper[4790]: E1003 08:42:44.218989 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:44.718972105 +0000 UTC m=+151.071906363 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.238776 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rf5kb"]
Oct 03 08:42:44 crc kubenswrapper[4790]: W1003 08:42:44.257263 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b04491_809e_431f_bac5_2a84accab94f.slice/crio-660712d86bf8e912df80819c20bd0c1a2587ee3cb72814bffa15c85100cea310 WatchSource:0}: Error finding container 660712d86bf8e912df80819c20bd0c1a2587ee3cb72814bffa15c85100cea310: Status 404 returned error can't find the container with id 660712d86bf8e912df80819c20bd0c1a2587ee3cb72814bffa15c85100cea310
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.320097 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 08:42:44 crc kubenswrapper[4790]: E1003 08:42:44.320606 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:44.820586671 +0000 UTC m=+151.173520909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.320645 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7ww29"]
Oct 03 08:42:44 crc kubenswrapper[4790]: W1003 08:42:44.345380 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3003b3b3_a1dd_4db2_aaee_8a4b0db7d46d.slice/crio-8c4b8f9f4557886462abf51272a21a64451fae94a03f341bb44ee38780eefaa3 WatchSource:0}: Error finding container 8c4b8f9f4557886462abf51272a21a64451fae94a03f341bb44ee38780eefaa3: Status 404 returned error can't find the container with id 8c4b8f9f4557886462abf51272a21a64451fae94a03f341bb44ee38780eefaa3
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.422206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj"
Oct 03 08:42:44 crc kubenswrapper[4790]: E1003 08:42:44.422669 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-03 08:42:44.922642569 +0000 UTC m=+151.275576807 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-csbrj" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.524391 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 08:42:44 crc kubenswrapper[4790]: E1003 08:42:44.524825 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-03 08:42:45.024805091 +0000 UTC m=+151.377739329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.530784 4790 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-03T08:42:43.770375189Z","Handler":null,"Name":""}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.543078 4790 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.543133 4790 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.625903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj"
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.628791 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.628843 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj"
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.649807 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-csbrj\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " pod="openshift-image-registry/image-registry-697d97f7c8-csbrj"
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.726781 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.733143 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.745116 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-4gzvp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.745193 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4gzvp" podUID="cf6939a0-5f2b-4db5-b5c2-613e19bb4b43" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.745242 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-4gzvp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.745297 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4gzvp" podUID="cf6939a0-5f2b-4db5-b5c2-613e19bb4b43" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.791478 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj"
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.944002 4790 generic.go:334] "Generic (PLEG): container finished" podID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerID="6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086" exitCode=0
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.944084 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj7jl" event={"ID":"b66141f0-fc3d-4596-9484-d01137e1f76c","Type":"ContainerDied","Data":"6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086"}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.944119 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj7jl" event={"ID":"b66141f0-fc3d-4596-9484-d01137e1f76c","Type":"ContainerStarted","Data":"af5f332bd6f8ba73c91b2fab8740d3da9902a418a6b8e11f089f6525fa28536d"}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.950394 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7c0b6cd99ac5393fa5198e3bdc63af43c49c8454eac4f2f803d3ca82031ea376"}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.951151 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.952739 4790 generic.go:334] "Generic (PLEG): container finished" podID="6b044862-0484-4701-a965-796c87a3adc1" containerID="93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d" exitCode=0
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.952807 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9chf" event={"ID":"6b044862-0484-4701-a965-796c87a3adc1","Type":"ContainerDied","Data":"93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d"}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.952829 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9chf" event={"ID":"6b044862-0484-4701-a965-796c87a3adc1","Type":"ContainerStarted","Data":"ae841a1c47fc6b6929936a4b8de238798847375c72009d576550606b5a6c4d5a"}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.955381 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"0f1a302fce3480679480f039462dfa0407efb305296a388d3e0d05efb0cf9ec0"}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.955510 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.958500 4790 generic.go:334] "Generic (PLEG): container finished" podID="46b04491-809e-431f-bac5-2a84accab94f" containerID="12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03" exitCode=0
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.958554 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf5kb" event={"ID":"46b04491-809e-431f-bac5-2a84accab94f","Type":"ContainerDied","Data":"12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03"}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.958623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf5kb" event={"ID":"46b04491-809e-431f-bac5-2a84accab94f","Type":"ContainerStarted","Data":"660712d86bf8e912df80819c20bd0c1a2587ee3cb72814bffa15c85100cea310"}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.962223 4790 generic.go:334] "Generic (PLEG): container finished" podID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerID="60f604cae7e195869e3adbb65ec6078f64ec51ce6f7da3dc96f758a17dfda06a" exitCode=0
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.962286 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ww29" event={"ID":"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d","Type":"ContainerDied","Data":"60f604cae7e195869e3adbb65ec6078f64ec51ce6f7da3dc96f758a17dfda06a"}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.962315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ww29" event={"ID":"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d","Type":"ContainerStarted","Data":"8c4b8f9f4557886462abf51272a21a64451fae94a03f341bb44ee38780eefaa3"}
Oct 03 08:42:44 crc kubenswrapper[4790]: I1003 08:42:44.993510 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" event={"ID":"a7785589-4fe5-4219-8fc8-f314d6e0dd7c","Type":"ContainerStarted","Data":"e602065509cd99af5d995b02b74ec3076618733346643396b2066e9caaaf3051"}
Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.016209 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-csbrj"]
Oct 03 08:42:45 crc kubenswrapper[4790]: W1003 08:42:45.039024 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f5de6e6_d3aa_4e4e_9f8c_bf67bf228b8c.slice/crio-5b0e9a812811ac3d469408b413be27b27ef9acd8068d65cb0718c22f9d428890 WatchSource:0}: Error finding container 5b0e9a812811ac3d469408b413be27b27ef9acd8068d65cb0718c22f9d428890: Status 404 returned error can't find the container with id 5b0e9a812811ac3d469408b413be27b27ef9acd8068d65cb0718c22f9d428890
Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.043914 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.050135 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wlp9g" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.073031 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-t9hsf" podStartSLOduration=11.073007797 podStartE2EDuration="11.073007797s" podCreationTimestamp="2025-10-03 08:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:45.068926783 +0000 UTC m=+151.421861021" watchObservedRunningTime="2025-10-03 08:42:45.073007797 +0000 UTC m=+151.425942035" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.101097 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sttgw"] Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.102939 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.105659 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.108334 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sttgw"] Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.121981 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:42:45 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Oct 03 08:42:45 crc kubenswrapper[4790]: [+]process-running ok Oct 03 08:42:45 crc kubenswrapper[4790]: healthz check failed Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.122062 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.235515 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-catalog-content\") pod \"redhat-marketplace-sttgw\" (UID: \"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.235654 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-utilities\") pod \"redhat-marketplace-sttgw\" (UID: 
\"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.235694 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrqzl\" (UniqueName: \"kubernetes.io/projected/de73f7fd-8767-4ac4-8975-9eb18a03b881-kube-api-access-vrqzl\") pod \"redhat-marketplace-sttgw\" (UID: \"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.337398 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-utilities\") pod \"redhat-marketplace-sttgw\" (UID: \"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.337474 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrqzl\" (UniqueName: \"kubernetes.io/projected/de73f7fd-8767-4ac4-8975-9eb18a03b881-kube-api-access-vrqzl\") pod \"redhat-marketplace-sttgw\" (UID: \"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.337520 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-catalog-content\") pod \"redhat-marketplace-sttgw\" (UID: \"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.338095 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-catalog-content\") pod \"redhat-marketplace-sttgw\" (UID: 
\"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.338359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-utilities\") pod \"redhat-marketplace-sttgw\" (UID: \"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.385792 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.386209 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.388589 4790 patch_prober.go:28] interesting pod/console-f9d7485db-ggr2t container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.388678 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ggr2t" podUID="f53170bf-fd8d-41c2-b353-50c5d5cff3d8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.389229 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrqzl\" (UniqueName: \"kubernetes.io/projected/de73f7fd-8767-4ac4-8975-9eb18a03b881-kube-api-access-vrqzl\") pod \"redhat-marketplace-sttgw\" (UID: \"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.449139 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.506126 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kbklb"] Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.507554 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.528808 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kbklb"] Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.643466 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-catalog-content\") pod \"redhat-marketplace-kbklb\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.643598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tks57\" (UniqueName: \"kubernetes.io/projected/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-kube-api-access-tks57\") pod \"redhat-marketplace-kbklb\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.643660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-utilities\") pod \"redhat-marketplace-kbklb\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.743510 4790 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sttgw"] Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.745015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-catalog-content\") pod \"redhat-marketplace-kbklb\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.745129 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tks57\" (UniqueName: \"kubernetes.io/projected/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-kube-api-access-tks57\") pod \"redhat-marketplace-kbklb\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.745162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-utilities\") pod \"redhat-marketplace-kbklb\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.755960 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-catalog-content\") pod \"redhat-marketplace-kbklb\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.756656 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-utilities\") pod \"redhat-marketplace-kbklb\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " 
pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.768052 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tks57\" (UniqueName: \"kubernetes.io/projected/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-kube-api-access-tks57\") pod \"redhat-marketplace-kbklb\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:45 crc kubenswrapper[4790]: I1003 08:42:45.837129 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.009904 4790 generic.go:334] "Generic (PLEG): container finished" podID="de73f7fd-8767-4ac4-8975-9eb18a03b881" containerID="d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6" exitCode=0 Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.010033 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sttgw" event={"ID":"de73f7fd-8767-4ac4-8975-9eb18a03b881","Type":"ContainerDied","Data":"d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6"} Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.010076 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sttgw" event={"ID":"de73f7fd-8767-4ac4-8975-9eb18a03b881","Type":"ContainerStarted","Data":"1ba3363a7c04846776f042309df4aa2352784325f68d70ea8fd7ece9fc2d41f2"} Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.026442 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" event={"ID":"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c","Type":"ContainerStarted","Data":"401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769"} Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.026505 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" event={"ID":"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c","Type":"ContainerStarted","Data":"5b0e9a812811ac3d469408b413be27b27ef9acd8068d65cb0718c22f9d428890"} Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.026646 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.044745 4790 generic.go:334] "Generic (PLEG): container finished" podID="68fadce0-3257-4525-b81f-552537c013be" containerID="464aec7f7c6fd3f9380c7c491dfee2cab43bacc0d8f586ee7aeb6b34f58b320e" exitCode=0 Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.045425 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" event={"ID":"68fadce0-3257-4525-b81f-552537c013be","Type":"ContainerDied","Data":"464aec7f7c6fd3f9380c7c491dfee2cab43bacc0d8f586ee7aeb6b34f58b320e"} Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.089789 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" podStartSLOduration=131.089708993 podStartE2EDuration="2m11.089708993s" podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:42:46.084301213 +0000 UTC m=+152.437235451" watchObservedRunningTime="2025-10-03 08:42:46.089708993 +0000 UTC m=+152.442643261" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.098143 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-njqpz"] Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.100032 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.100408 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kbklb"] Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.101869 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.123564 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-njqpz"] Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.128794 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:42:46 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Oct 03 08:42:46 crc kubenswrapper[4790]: [+]process-running ok Oct 03 08:42:46 crc kubenswrapper[4790]: healthz check failed Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.128855 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:46 crc kubenswrapper[4790]: W1003 08:42:46.137900 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf0cfae_75a1_4e0d_95cc_c62f43e6748e.slice/crio-1fb0ff6a09218f5c7dd15a9cd28115b859d6aa1f56d07f7eea98dfc56dfb2bc1 WatchSource:0}: Error finding container 1fb0ff6a09218f5c7dd15a9cd28115b859d6aa1f56d07f7eea98dfc56dfb2bc1: Status 404 returned error can't find the container with id 1fb0ff6a09218f5c7dd15a9cd28115b859d6aa1f56d07f7eea98dfc56dfb2bc1 Oct 03 08:42:46 crc 
kubenswrapper[4790]: I1003 08:42:46.153403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b485\" (UniqueName: \"kubernetes.io/projected/4882956d-71db-444d-9573-83151dc6f80f-kube-api-access-2b485\") pod \"redhat-operators-njqpz\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.155346 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-utilities\") pod \"redhat-operators-njqpz\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.155425 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-catalog-content\") pod \"redhat-operators-njqpz\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.257196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-utilities\") pod \"redhat-operators-njqpz\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.257261 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-catalog-content\") pod \"redhat-operators-njqpz\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc 
kubenswrapper[4790]: I1003 08:42:46.257340 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b485\" (UniqueName: \"kubernetes.io/projected/4882956d-71db-444d-9573-83151dc6f80f-kube-api-access-2b485\") pod \"redhat-operators-njqpz\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.257966 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-catalog-content\") pod \"redhat-operators-njqpz\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.257993 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-utilities\") pod \"redhat-operators-njqpz\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.281204 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b485\" (UniqueName: \"kubernetes.io/projected/4882956d-71db-444d-9573-83151dc6f80f-kube-api-access-2b485\") pod \"redhat-operators-njqpz\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.300493 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hf4ws"] Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.303126 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.311792 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hf4ws"] Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.351320 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.358401 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-utilities\") pod \"redhat-operators-hf4ws\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.358466 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc6v2\" (UniqueName: \"kubernetes.io/projected/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-kube-api-access-cc6v2\") pod \"redhat-operators-hf4ws\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.358519 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-catalog-content\") pod \"redhat-operators-hf4ws\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.424718 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.461022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-catalog-content\") pod \"redhat-operators-hf4ws\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.461147 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-utilities\") pod \"redhat-operators-hf4ws\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.461216 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc6v2\" (UniqueName: \"kubernetes.io/projected/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-kube-api-access-cc6v2\") pod \"redhat-operators-hf4ws\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.462435 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-catalog-content\") pod \"redhat-operators-hf4ws\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.462558 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-utilities\") pod \"redhat-operators-hf4ws\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " 
pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.483331 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc6v2\" (UniqueName: \"kubernetes.io/projected/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-kube-api-access-cc6v2\") pod \"redhat-operators-hf4ws\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.628479 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.688246 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-njqpz"] Oct 03 08:42:46 crc kubenswrapper[4790]: W1003 08:42:46.693878 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4882956d_71db_444d_9573_83151dc6f80f.slice/crio-7d5ac4fac659d76d7b1b9352a79b6eb40b0c8666b7614e205a39fe34eb7c3f4d WatchSource:0}: Error finding container 7d5ac4fac659d76d7b1b9352a79b6eb40b0c8666b7614e205a39fe34eb7c3f4d: Status 404 returned error can't find the container with id 7d5ac4fac659d76d7b1b9352a79b6eb40b0c8666b7614e205a39fe34eb7c3f4d Oct 03 08:42:46 crc kubenswrapper[4790]: I1003 08:42:46.986066 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hf4ws"] Oct 03 08:42:46 crc kubenswrapper[4790]: W1003 08:42:46.996837 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cd66a26_e1dc_4abc_b1bf_8c1dc4f84b60.slice/crio-87448abe1481b3b05e21ce650819a5d35eeca1719026993fad5356e3b08a9287 WatchSource:0}: Error finding container 87448abe1481b3b05e21ce650819a5d35eeca1719026993fad5356e3b08a9287: Status 404 returned error can't find the container with id 
87448abe1481b3b05e21ce650819a5d35eeca1719026993fad5356e3b08a9287 Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.054451 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njqpz" event={"ID":"4882956d-71db-444d-9573-83151dc6f80f","Type":"ContainerStarted","Data":"7d5ac4fac659d76d7b1b9352a79b6eb40b0c8666b7614e205a39fe34eb7c3f4d"} Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.060264 4790 generic.go:334] "Generic (PLEG): container finished" podID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerID="4b77087d27a5ee62dfe25a350b5062286b7f3afcfb1a02e375563cea79ce74d0" exitCode=0 Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.060358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kbklb" event={"ID":"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e","Type":"ContainerDied","Data":"4b77087d27a5ee62dfe25a350b5062286b7f3afcfb1a02e375563cea79ce74d0"} Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.060394 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kbklb" event={"ID":"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e","Type":"ContainerStarted","Data":"1fb0ff6a09218f5c7dd15a9cd28115b859d6aa1f56d07f7eea98dfc56dfb2bc1"} Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.066244 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-87tx4" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.068603 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hf4ws" event={"ID":"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60","Type":"ContainerStarted","Data":"87448abe1481b3b05e21ce650819a5d35eeca1719026993fad5356e3b08a9287"} Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.115540 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.124467 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:42:47 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Oct 03 08:42:47 crc kubenswrapper[4790]: [+]process-running ok Oct 03 08:42:47 crc kubenswrapper[4790]: healthz check failed Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.124547 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.256735 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.257809 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.260350 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.261309 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.274547 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.299495 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.380095 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68fadce0-3257-4525-b81f-552537c013be-config-volume" (OuterVolumeSpecName: "config-volume") pod "68fadce0-3257-4525-b81f-552537c013be" (UID: "68fadce0-3257-4525-b81f-552537c013be"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.382291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68fadce0-3257-4525-b81f-552537c013be-config-volume\") pod \"68fadce0-3257-4525-b81f-552537c013be\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.382456 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68fadce0-3257-4525-b81f-552537c013be-secret-volume\") pod \"68fadce0-3257-4525-b81f-552537c013be\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.382514 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv9wm\" (UniqueName: \"kubernetes.io/projected/68fadce0-3257-4525-b81f-552537c013be-kube-api-access-lv9wm\") pod \"68fadce0-3257-4525-b81f-552537c013be\" (UID: \"68fadce0-3257-4525-b81f-552537c013be\") " Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.383110 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3cf3999-ed97-44d0-a955-bf33f37c2582-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f3cf3999-ed97-44d0-a955-bf33f37c2582\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.383248 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3cf3999-ed97-44d0-a955-bf33f37c2582-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f3cf3999-ed97-44d0-a955-bf33f37c2582\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.383313 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68fadce0-3257-4525-b81f-552537c013be-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.407166 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68fadce0-3257-4525-b81f-552537c013be-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "68fadce0-3257-4525-b81f-552537c013be" (UID: "68fadce0-3257-4525-b81f-552537c013be"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.418343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68fadce0-3257-4525-b81f-552537c013be-kube-api-access-lv9wm" (OuterVolumeSpecName: "kube-api-access-lv9wm") pod "68fadce0-3257-4525-b81f-552537c013be" (UID: "68fadce0-3257-4525-b81f-552537c013be"). InnerVolumeSpecName "kube-api-access-lv9wm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.484417 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3cf3999-ed97-44d0-a955-bf33f37c2582-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f3cf3999-ed97-44d0-a955-bf33f37c2582\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.484658 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3cf3999-ed97-44d0-a955-bf33f37c2582-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f3cf3999-ed97-44d0-a955-bf33f37c2582\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.484728 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/68fadce0-3257-4525-b81f-552537c013be-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.484764 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv9wm\" (UniqueName: \"kubernetes.io/projected/68fadce0-3257-4525-b81f-552537c013be-kube-api-access-lv9wm\") on node \"crc\" DevicePath \"\"" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.485284 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3cf3999-ed97-44d0-a955-bf33f37c2582-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f3cf3999-ed97-44d0-a955-bf33f37c2582\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.523843 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/f3cf3999-ed97-44d0-a955-bf33f37c2582-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f3cf3999-ed97-44d0-a955-bf33f37c2582\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.590108 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:42:47 crc kubenswrapper[4790]: I1003 08:42:47.946711 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 03 08:42:48 crc kubenswrapper[4790]: I1003 08:42:48.106288 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njqpz" event={"ID":"4882956d-71db-444d-9573-83151dc6f80f","Type":"ContainerDied","Data":"f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4"} Oct 03 08:42:48 crc kubenswrapper[4790]: I1003 08:42:48.105510 4790 generic.go:334] "Generic (PLEG): container finished" podID="4882956d-71db-444d-9573-83151dc6f80f" containerID="f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4" exitCode=0 Oct 03 08:42:48 crc kubenswrapper[4790]: I1003 08:42:48.118943 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3cf3999-ed97-44d0-a955-bf33f37c2582","Type":"ContainerStarted","Data":"3b3685b466d0e85e29d944e36ebb7bec11f0f596f72d64111611a09b0094e42f"} Oct 03 08:42:48 crc kubenswrapper[4790]: I1003 08:42:48.120239 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:42:48 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Oct 03 08:42:48 crc kubenswrapper[4790]: [+]process-running ok Oct 03 08:42:48 crc 
kubenswrapper[4790]: healthz check failed Oct 03 08:42:48 crc kubenswrapper[4790]: I1003 08:42:48.120290 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:48 crc kubenswrapper[4790]: I1003 08:42:48.121811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" event={"ID":"68fadce0-3257-4525-b81f-552537c013be","Type":"ContainerDied","Data":"244011290d41446dc3cc9418cf6bdf2d65a640e03359c49e6ec8cb241dcbca16"} Oct 03 08:42:48 crc kubenswrapper[4790]: I1003 08:42:48.121834 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="244011290d41446dc3cc9418cf6bdf2d65a640e03359c49e6ec8cb241dcbca16" Oct 03 08:42:48 crc kubenswrapper[4790]: I1003 08:42:48.121905 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps" Oct 03 08:42:48 crc kubenswrapper[4790]: I1003 08:42:48.150038 4790 generic.go:334] "Generic (PLEG): container finished" podID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerID="6ead4248df84ba8360815412f5e280fadb271fc3519ffa3bba1c231a3e9e9102" exitCode=0 Oct 03 08:42:48 crc kubenswrapper[4790]: I1003 08:42:48.150094 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hf4ws" event={"ID":"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60","Type":"ContainerDied","Data":"6ead4248df84ba8360815412f5e280fadb271fc3519ffa3bba1c231a3e9e9102"} Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.118937 4790 patch_prober.go:28] interesting pod/router-default-5444994796-xhdkx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 03 08:42:49 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Oct 03 08:42:49 crc kubenswrapper[4790]: [+]process-running ok Oct 03 08:42:49 crc kubenswrapper[4790]: healthz check failed Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.119533 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xhdkx" podUID="8f295eb1-856c-4063-8411-3282ffac9de5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.177708 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3cf3999-ed97-44d0-a955-bf33f37c2582","Type":"ContainerStarted","Data":"166ada86390f6ea32ceedcf292a73bbf3e77ed9eee6bef5e69480c53abad9a84"} Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.463041 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 08:42:49 crc kubenswrapper[4790]: E1003 08:42:49.463367 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68fadce0-3257-4525-b81f-552537c013be" containerName="collect-profiles" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.463383 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="68fadce0-3257-4525-b81f-552537c013be" containerName="collect-profiles" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.463504 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="68fadce0-3257-4525-b81f-552537c013be" containerName="collect-profiles" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.464025 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.470246 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.472793 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.481656 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.547689 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fca400f1-6a23-4dd7-b518-7dadfadd3809-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fca400f1-6a23-4dd7-b518-7dadfadd3809\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.547747 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/fca400f1-6a23-4dd7-b518-7dadfadd3809-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fca400f1-6a23-4dd7-b518-7dadfadd3809\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.649004 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fca400f1-6a23-4dd7-b518-7dadfadd3809-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fca400f1-6a23-4dd7-b518-7dadfadd3809\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.649079 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fca400f1-6a23-4dd7-b518-7dadfadd3809-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fca400f1-6a23-4dd7-b518-7dadfadd3809\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.649221 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fca400f1-6a23-4dd7-b518-7dadfadd3809-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"fca400f1-6a23-4dd7-b518-7dadfadd3809\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.677658 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fca400f1-6a23-4dd7-b518-7dadfadd3809-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"fca400f1-6a23-4dd7-b518-7dadfadd3809\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:42:49 crc kubenswrapper[4790]: I1003 08:42:49.800976 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:42:50 crc kubenswrapper[4790]: I1003 08:42:50.123767 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:50 crc kubenswrapper[4790]: I1003 08:42:50.137484 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xhdkx" Oct 03 08:42:50 crc kubenswrapper[4790]: I1003 08:42:50.218393 4790 generic.go:334] "Generic (PLEG): container finished" podID="f3cf3999-ed97-44d0-a955-bf33f37c2582" containerID="166ada86390f6ea32ceedcf292a73bbf3e77ed9eee6bef5e69480c53abad9a84" exitCode=0 Oct 03 08:42:50 crc kubenswrapper[4790]: I1003 08:42:50.219776 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3cf3999-ed97-44d0-a955-bf33f37c2582","Type":"ContainerDied","Data":"166ada86390f6ea32ceedcf292a73bbf3e77ed9eee6bef5e69480c53abad9a84"} Oct 03 08:42:50 crc kubenswrapper[4790]: I1003 08:42:50.451770 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 03 08:42:51 crc kubenswrapper[4790]: I1003 08:42:51.277303 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fca400f1-6a23-4dd7-b518-7dadfadd3809","Type":"ContainerStarted","Data":"c0bbafae5d032ac9e4e473fb155772c4b8e92acfb3e243f670e7f2e33c91ca05"} Oct 03 08:42:51 crc kubenswrapper[4790]: I1003 08:42:51.684205 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2j9mx" Oct 03 08:42:52 crc kubenswrapper[4790]: I1003 08:42:52.333398 4790 generic.go:334] "Generic (PLEG): container finished" podID="fca400f1-6a23-4dd7-b518-7dadfadd3809" containerID="21c6dbc269961d8f4dc6ce4c55e33ee1c2b64558609141e8abda5789f1139139" exitCode=0 Oct 03 08:42:52 crc 
kubenswrapper[4790]: I1003 08:42:52.363499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fca400f1-6a23-4dd7-b518-7dadfadd3809","Type":"ContainerDied","Data":"21c6dbc269961d8f4dc6ce4c55e33ee1c2b64558609141e8abda5789f1139139"} Oct 03 08:42:54 crc kubenswrapper[4790]: I1003 08:42:54.757595 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-4gzvp" Oct 03 08:42:55 crc kubenswrapper[4790]: I1003 08:42:55.383691 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:55 crc kubenswrapper[4790]: I1003 08:42:55.389704 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:42:57 crc kubenswrapper[4790]: I1003 08:42:57.093504 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:57 crc kubenswrapper[4790]: I1003 08:42:57.102251 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9236b957-81cb-40bc-969e-a2057884ba50-metrics-certs\") pod \"network-metrics-daemon-4vl95\" (UID: \"9236b957-81cb-40bc-969e-a2057884ba50\") " pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:42:57 crc kubenswrapper[4790]: I1003 08:42:57.146053 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vl95" Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.164837 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.305347 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fca400f1-6a23-4dd7-b518-7dadfadd3809-kubelet-dir\") pod \"fca400f1-6a23-4dd7-b518-7dadfadd3809\" (UID: \"fca400f1-6a23-4dd7-b518-7dadfadd3809\") " Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.305480 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fca400f1-6a23-4dd7-b518-7dadfadd3809-kube-api-access\") pod \"fca400f1-6a23-4dd7-b518-7dadfadd3809\" (UID: \"fca400f1-6a23-4dd7-b518-7dadfadd3809\") " Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.305932 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fca400f1-6a23-4dd7-b518-7dadfadd3809-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fca400f1-6a23-4dd7-b518-7dadfadd3809" (UID: "fca400f1-6a23-4dd7-b518-7dadfadd3809"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.311846 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fca400f1-6a23-4dd7-b518-7dadfadd3809-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fca400f1-6a23-4dd7-b518-7dadfadd3809" (UID: "fca400f1-6a23-4dd7-b518-7dadfadd3809"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.408209 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fca400f1-6a23-4dd7-b518-7dadfadd3809-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.408272 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fca400f1-6a23-4dd7-b518-7dadfadd3809-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.459415 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"fca400f1-6a23-4dd7-b518-7dadfadd3809","Type":"ContainerDied","Data":"c0bbafae5d032ac9e4e473fb155772c4b8e92acfb3e243f670e7f2e33c91ca05"} Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.459464 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0bbafae5d032ac9e4e473fb155772c4b8e92acfb3e243f670e7f2e33c91ca05" Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.459534 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 03 08:43:04 crc kubenswrapper[4790]: I1003 08:43:04.798616 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:43:10 crc kubenswrapper[4790]: I1003 08:43:10.187143 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:43:10 crc kubenswrapper[4790]: I1003 08:43:10.187687 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:43:16 crc kubenswrapper[4790]: I1003 08:43:16.785983 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8f5qg" Oct 03 08:43:19 crc kubenswrapper[4790]: I1003 08:43:19.648132 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:43:19 crc kubenswrapper[4790]: I1003 08:43:19.757552 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3cf3999-ed97-44d0-a955-bf33f37c2582-kube-api-access\") pod \"f3cf3999-ed97-44d0-a955-bf33f37c2582\" (UID: \"f3cf3999-ed97-44d0-a955-bf33f37c2582\") " Oct 03 08:43:19 crc kubenswrapper[4790]: I1003 08:43:19.757640 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3cf3999-ed97-44d0-a955-bf33f37c2582-kubelet-dir\") pod \"f3cf3999-ed97-44d0-a955-bf33f37c2582\" (UID: \"f3cf3999-ed97-44d0-a955-bf33f37c2582\") " Oct 03 08:43:19 crc kubenswrapper[4790]: I1003 08:43:19.757823 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3cf3999-ed97-44d0-a955-bf33f37c2582-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f3cf3999-ed97-44d0-a955-bf33f37c2582" (UID: "f3cf3999-ed97-44d0-a955-bf33f37c2582"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:43:19 crc kubenswrapper[4790]: I1003 08:43:19.766471 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3cf3999-ed97-44d0-a955-bf33f37c2582-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f3cf3999-ed97-44d0-a955-bf33f37c2582" (UID: "f3cf3999-ed97-44d0-a955-bf33f37c2582"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:43:19 crc kubenswrapper[4790]: I1003 08:43:19.858806 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f3cf3999-ed97-44d0-a955-bf33f37c2582-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:19 crc kubenswrapper[4790]: I1003 08:43:19.858854 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3cf3999-ed97-44d0-a955-bf33f37c2582-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 03 08:43:20 crc kubenswrapper[4790]: I1003 08:43:20.551531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f3cf3999-ed97-44d0-a955-bf33f37c2582","Type":"ContainerDied","Data":"3b3685b466d0e85e29d944e36ebb7bec11f0f596f72d64111611a09b0094e42f"} Oct 03 08:43:20 crc kubenswrapper[4790]: I1003 08:43:20.551608 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b3685b466d0e85e29d944e36ebb7bec11f0f596f72d64111611a09b0094e42f" Oct 03 08:43:20 crc kubenswrapper[4790]: I1003 08:43:20.551678 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 03 08:43:22 crc kubenswrapper[4790]: I1003 08:43:22.571623 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 03 08:43:24 crc kubenswrapper[4790]: E1003 08:43:24.979826 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 08:43:24 crc kubenswrapper[4790]: E1003 08:43:24.980482 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdtwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompP
rofile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zj7jl_openshift-marketplace(b66141f0-fc3d-4596-9484-d01137e1f76c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:43:24 crc kubenswrapper[4790]: E1003 08:43:24.981736 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zj7jl" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" Oct 03 08:43:26 crc kubenswrapper[4790]: E1003 08:43:26.433846 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zj7jl" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" Oct 03 08:43:26 crc kubenswrapper[4790]: E1003 08:43:26.559490 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 08:43:26 crc kubenswrapper[4790]: E1003 08:43:26.559746 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5frg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7ww29_openshift-marketplace(3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:43:26 crc kubenswrapper[4790]: E1003 08:43:26.561031 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7ww29" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" Oct 03 08:43:26 crc 
kubenswrapper[4790]: E1003 08:43:26.563488 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 03 08:43:26 crc kubenswrapper[4790]: E1003 08:43:26.563689 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b8qfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-rf5kb_openshift-marketplace(46b04491-809e-431f-bac5-2a84accab94f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:43:26 crc kubenswrapper[4790]: E1003 08:43:26.565730 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rf5kb" podUID="46b04491-809e-431f-bac5-2a84accab94f" Oct 03 08:43:29 crc kubenswrapper[4790]: E1003 08:43:29.760728 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7ww29" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" Oct 03 08:43:29 crc kubenswrapper[4790]: E1003 08:43:29.760869 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rf5kb" podUID="46b04491-809e-431f-bac5-2a84accab94f" Oct 03 08:43:29 crc kubenswrapper[4790]: E1003 08:43:29.843033 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 08:43:29 crc kubenswrapper[4790]: E1003 08:43:29.843839 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2b485,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-njqpz_openshift-marketplace(4882956d-71db-444d-9573-83151dc6f80f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:43:29 crc kubenswrapper[4790]: E1003 08:43:29.845967 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-njqpz" podUID="4882956d-71db-444d-9573-83151dc6f80f" Oct 03 08:43:32 crc kubenswrapper[4790]: E1003 08:43:32.242528 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-njqpz" podUID="4882956d-71db-444d-9573-83151dc6f80f" Oct 03 08:43:32 crc kubenswrapper[4790]: E1003 08:43:32.322889 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 03 08:43:32 crc kubenswrapper[4790]: E1003 08:43:32.323040 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t6w6k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-g9chf_openshift-marketplace(6b044862-0484-4701-a965-796c87a3adc1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:43:32 crc kubenswrapper[4790]: E1003 08:43:32.324219 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-g9chf" podUID="6b044862-0484-4701-a965-796c87a3adc1" Oct 03 08:43:32 crc 
kubenswrapper[4790]: E1003 08:43:32.365947 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 03 08:43:32 crc kubenswrapper[4790]: E1003 08:43:32.366093 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cc6v2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-hf4ws_openshift-marketplace(0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:43:32 crc kubenswrapper[4790]: E1003 08:43:32.367280 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hf4ws" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" Oct 03 08:43:32 crc kubenswrapper[4790]: I1003 08:43:32.421247 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4vl95"] Oct 03 08:43:33 crc kubenswrapper[4790]: E1003 08:43:33.083864 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hf4ws" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" Oct 03 08:43:33 crc kubenswrapper[4790]: E1003 08:43:33.083972 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-g9chf" podUID="6b044862-0484-4701-a965-796c87a3adc1" Oct 03 08:43:33 crc kubenswrapper[4790]: W1003 08:43:33.092508 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9236b957_81cb_40bc_969e_a2057884ba50.slice/crio-a9690953ac015b2dc26134139b2974d836bd8b813e7ecc02a25282f0add7bf4c WatchSource:0}: Error finding container 
a9690953ac015b2dc26134139b2974d836bd8b813e7ecc02a25282f0add7bf4c: Status 404 returned error can't find the container with id a9690953ac015b2dc26134139b2974d836bd8b813e7ecc02a25282f0add7bf4c Oct 03 08:43:33 crc kubenswrapper[4790]: E1003 08:43:33.160966 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 08:43:33 crc kubenswrapper[4790]: E1003 08:43:33.161166 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vrqzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},
TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sttgw_openshift-marketplace(de73f7fd-8767-4ac4-8975-9eb18a03b881): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:43:33 crc kubenswrapper[4790]: E1003 08:43:33.162718 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sttgw" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" Oct 03 08:43:33 crc kubenswrapper[4790]: E1003 08:43:33.166890 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 03 08:43:33 crc kubenswrapper[4790]: E1003 08:43:33.167012 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tks57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kbklb_openshift-marketplace(fcf0cfae-75a1-4e0d-95cc-c62f43e6748e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:43:33 crc kubenswrapper[4790]: E1003 08:43:33.168113 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kbklb" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" Oct 03 08:43:33 crc 
kubenswrapper[4790]: I1003 08:43:33.636395 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vl95" event={"ID":"9236b957-81cb-40bc-969e-a2057884ba50","Type":"ContainerStarted","Data":"4a6ded7f0fe6b7cc2878460f4176edbb7b7e4e5b34d811db399920d0988e569b"} Oct 03 08:43:33 crc kubenswrapper[4790]: I1003 08:43:33.636804 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vl95" event={"ID":"9236b957-81cb-40bc-969e-a2057884ba50","Type":"ContainerStarted","Data":"a9690953ac015b2dc26134139b2974d836bd8b813e7ecc02a25282f0add7bf4c"} Oct 03 08:43:33 crc kubenswrapper[4790]: E1003 08:43:33.637868 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sttgw" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" Oct 03 08:43:33 crc kubenswrapper[4790]: E1003 08:43:33.638625 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kbklb" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" Oct 03 08:43:34 crc kubenswrapper[4790]: I1003 08:43:34.644085 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vl95" event={"ID":"9236b957-81cb-40bc-969e-a2057884ba50","Type":"ContainerStarted","Data":"5df8c34faca29be1e7c076cbb1a9ef2595b0a34450807e3f2930fce49d37f176"} Oct 03 08:43:34 crc kubenswrapper[4790]: I1003 08:43:34.682135 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4vl95" podStartSLOduration=179.682108078 podStartE2EDuration="2m59.682108078s" 
podCreationTimestamp="2025-10-03 08:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:43:34.67617348 +0000 UTC m=+201.029107768" watchObservedRunningTime="2025-10-03 08:43:34.682108078 +0000 UTC m=+201.035042346" Oct 03 08:43:40 crc kubenswrapper[4790]: I1003 08:43:40.186391 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:43:40 crc kubenswrapper[4790]: I1003 08:43:40.187023 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:43:40 crc kubenswrapper[4790]: I1003 08:43:40.187113 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:43:40 crc kubenswrapper[4790]: I1003 08:43:40.187774 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:43:40 crc kubenswrapper[4790]: I1003 08:43:40.187866 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" 
containerID="cri-o://b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74" gracePeriod=600 Oct 03 08:43:40 crc kubenswrapper[4790]: I1003 08:43:40.676043 4790 generic.go:334] "Generic (PLEG): container finished" podID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerID="a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba" exitCode=0 Oct 03 08:43:40 crc kubenswrapper[4790]: I1003 08:43:40.676188 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj7jl" event={"ID":"b66141f0-fc3d-4596-9484-d01137e1f76c","Type":"ContainerDied","Data":"a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba"} Oct 03 08:43:40 crc kubenswrapper[4790]: I1003 08:43:40.682131 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74" exitCode=0 Oct 03 08:43:40 crc kubenswrapper[4790]: I1003 08:43:40.682180 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74"} Oct 03 08:43:42 crc kubenswrapper[4790]: I1003 08:43:42.693060 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"43a28ab79be93dc66c4ea1f907c9eeb943c508f991b4352e4e0fde3491023e73"} Oct 03 08:43:48 crc kubenswrapper[4790]: I1003 08:43:48.733128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj7jl" event={"ID":"b66141f0-fc3d-4596-9484-d01137e1f76c","Type":"ContainerStarted","Data":"12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a"} Oct 03 08:43:48 crc kubenswrapper[4790]: I1003 08:43:48.735908 
4790 generic.go:334] "Generic (PLEG): container finished" podID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerID="639caa38f1ceb2de83050259dd0277c8a84ca46c2f1a7f92887c5d6767d734ae" exitCode=0 Oct 03 08:43:48 crc kubenswrapper[4790]: I1003 08:43:48.735997 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ww29" event={"ID":"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d","Type":"ContainerDied","Data":"639caa38f1ceb2de83050259dd0277c8a84ca46c2f1a7f92887c5d6767d734ae"} Oct 03 08:43:48 crc kubenswrapper[4790]: I1003 08:43:48.759823 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zj7jl" podStartSLOduration=4.085735381 podStartE2EDuration="1m6.759797993s" podCreationTimestamp="2025-10-03 08:42:42 +0000 UTC" firstStartedPulling="2025-10-03 08:42:44.95085621 +0000 UTC m=+151.303790448" lastFinishedPulling="2025-10-03 08:43:47.624918782 +0000 UTC m=+213.977853060" observedRunningTime="2025-10-03 08:43:48.75937088 +0000 UTC m=+215.112305128" watchObservedRunningTime="2025-10-03 08:43:48.759797993 +0000 UTC m=+215.112732261" Oct 03 08:43:50 crc kubenswrapper[4790]: I1003 08:43:50.754986 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hf4ws" event={"ID":"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60","Type":"ContainerStarted","Data":"d0c7c12ee49b4ecc5c663f9444dfa614ff5abe29bc3b169c3b6503919805929c"} Oct 03 08:43:50 crc kubenswrapper[4790]: I1003 08:43:50.756810 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ww29" event={"ID":"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d","Type":"ContainerStarted","Data":"2a7aeb9a0d9d77a0cebddb45633127e5061faa37f0aea90346c95faa788f53a1"} Oct 03 08:43:50 crc kubenswrapper[4790]: I1003 08:43:50.758177 4790 generic.go:334] "Generic (PLEG): container finished" podID="de73f7fd-8767-4ac4-8975-9eb18a03b881" 
containerID="3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae" exitCode=0 Oct 03 08:43:50 crc kubenswrapper[4790]: I1003 08:43:50.758232 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sttgw" event={"ID":"de73f7fd-8767-4ac4-8975-9eb18a03b881","Type":"ContainerDied","Data":"3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae"} Oct 03 08:43:50 crc kubenswrapper[4790]: I1003 08:43:50.760309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njqpz" event={"ID":"4882956d-71db-444d-9573-83151dc6f80f","Type":"ContainerStarted","Data":"06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143"} Oct 03 08:43:50 crc kubenswrapper[4790]: I1003 08:43:50.761909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9chf" event={"ID":"6b044862-0484-4701-a965-796c87a3adc1","Type":"ContainerStarted","Data":"24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca"} Oct 03 08:43:50 crc kubenswrapper[4790]: I1003 08:43:50.764484 4790 generic.go:334] "Generic (PLEG): container finished" podID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerID="690d03f1203280c73635776c84f11dbfa51e5327ff68cfd63957b2c2d3c0656a" exitCode=0 Oct 03 08:43:50 crc kubenswrapper[4790]: I1003 08:43:50.764507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kbklb" event={"ID":"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e","Type":"ContainerDied","Data":"690d03f1203280c73635776c84f11dbfa51e5327ff68cfd63957b2c2d3c0656a"} Oct 03 08:43:50 crc kubenswrapper[4790]: I1003 08:43:50.766386 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf5kb" event={"ID":"46b04491-809e-431f-bac5-2a84accab94f","Type":"ContainerStarted","Data":"9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e"} Oct 03 08:43:50 crc kubenswrapper[4790]: 
I1003 08:43:50.834022 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7ww29" podStartSLOduration=2.556529834 podStartE2EDuration="1m7.834008035s" podCreationTimestamp="2025-10-03 08:42:43 +0000 UTC" firstStartedPulling="2025-10-03 08:42:44.963630155 +0000 UTC m=+151.316564393" lastFinishedPulling="2025-10-03 08:43:50.241108356 +0000 UTC m=+216.594042594" observedRunningTime="2025-10-03 08:43:50.832022378 +0000 UTC m=+217.184956616" watchObservedRunningTime="2025-10-03 08:43:50.834008035 +0000 UTC m=+217.186942273" Oct 03 08:43:51 crc kubenswrapper[4790]: I1003 08:43:51.792478 4790 generic.go:334] "Generic (PLEG): container finished" podID="6b044862-0484-4701-a965-796c87a3adc1" containerID="24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca" exitCode=0 Oct 03 08:43:51 crc kubenswrapper[4790]: I1003 08:43:51.792804 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9chf" event={"ID":"6b044862-0484-4701-a965-796c87a3adc1","Type":"ContainerDied","Data":"24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca"} Oct 03 08:43:51 crc kubenswrapper[4790]: I1003 08:43:51.799188 4790 generic.go:334] "Generic (PLEG): container finished" podID="46b04491-809e-431f-bac5-2a84accab94f" containerID="9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e" exitCode=0 Oct 03 08:43:51 crc kubenswrapper[4790]: I1003 08:43:51.799270 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf5kb" event={"ID":"46b04491-809e-431f-bac5-2a84accab94f","Type":"ContainerDied","Data":"9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e"} Oct 03 08:43:51 crc kubenswrapper[4790]: I1003 08:43:51.805563 4790 generic.go:334] "Generic (PLEG): container finished" podID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerID="d0c7c12ee49b4ecc5c663f9444dfa614ff5abe29bc3b169c3b6503919805929c" 
exitCode=0 Oct 03 08:43:51 crc kubenswrapper[4790]: I1003 08:43:51.805682 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hf4ws" event={"ID":"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60","Type":"ContainerDied","Data":"d0c7c12ee49b4ecc5c663f9444dfa614ff5abe29bc3b169c3b6503919805929c"} Oct 03 08:43:51 crc kubenswrapper[4790]: I1003 08:43:51.815410 4790 generic.go:334] "Generic (PLEG): container finished" podID="4882956d-71db-444d-9573-83151dc6f80f" containerID="06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143" exitCode=0 Oct 03 08:43:51 crc kubenswrapper[4790]: I1003 08:43:51.815482 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njqpz" event={"ID":"4882956d-71db-444d-9573-83151dc6f80f","Type":"ContainerDied","Data":"06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143"} Oct 03 08:43:53 crc kubenswrapper[4790]: I1003 08:43:53.471539 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:43:53 crc kubenswrapper[4790]: I1003 08:43:53.471931 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:43:53 crc kubenswrapper[4790]: I1003 08:43:53.901325 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7ww29" Oct 03 08:43:53 crc kubenswrapper[4790]: I1003 08:43:53.901361 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7ww29" Oct 03 08:43:53 crc kubenswrapper[4790]: I1003 08:43:53.953174 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7ww29" Oct 03 08:43:53 crc kubenswrapper[4790]: I1003 08:43:53.954144 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:43:54 crc kubenswrapper[4790]: I1003 08:43:54.001960 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:44:02 crc kubenswrapper[4790]: I1003 08:44:02.872673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kbklb" event={"ID":"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e","Type":"ContainerStarted","Data":"9914fa66b7868939f58ecb666f2ef0d53c3a1a56def9d45873b8094d2a392fbe"} Oct 03 08:44:03 crc kubenswrapper[4790]: I1003 08:44:03.903740 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kbklb" podStartSLOduration=4.817855161 podStartE2EDuration="1m18.903719009s" podCreationTimestamp="2025-10-03 08:42:45 +0000 UTC" firstStartedPulling="2025-10-03 08:42:47.062016604 +0000 UTC m=+153.414950842" lastFinishedPulling="2025-10-03 08:44:01.147880412 +0000 UTC m=+227.500814690" observedRunningTime="2025-10-03 08:44:03.901136977 +0000 UTC m=+230.254071235" watchObservedRunningTime="2025-10-03 08:44:03.903719009 +0000 UTC m=+230.256653257" Oct 03 08:44:03 crc kubenswrapper[4790]: I1003 08:44:03.945779 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7ww29" Oct 03 08:44:04 crc kubenswrapper[4790]: I1003 08:44:04.116323 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7ww29"] Oct 03 08:44:04 crc kubenswrapper[4790]: I1003 08:44:04.885083 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7ww29" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerName="registry-server" containerID="cri-o://2a7aeb9a0d9d77a0cebddb45633127e5061faa37f0aea90346c95faa788f53a1" gracePeriod=2 Oct 03 08:44:05 crc kubenswrapper[4790]: I1003 08:44:05.837889 
4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:44:05 crc kubenswrapper[4790]: I1003 08:44:05.838529 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:44:05 crc kubenswrapper[4790]: I1003 08:44:05.879403 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:44:06 crc kubenswrapper[4790]: I1003 08:44:06.901939 4790 generic.go:334] "Generic (PLEG): container finished" podID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerID="2a7aeb9a0d9d77a0cebddb45633127e5061faa37f0aea90346c95faa788f53a1" exitCode=0 Oct 03 08:44:06 crc kubenswrapper[4790]: I1003 08:44:06.902027 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ww29" event={"ID":"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d","Type":"ContainerDied","Data":"2a7aeb9a0d9d77a0cebddb45633127e5061faa37f0aea90346c95faa788f53a1"} Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.287245 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7ww29" Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.483201 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-catalog-content\") pod \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.483385 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5frg\" (UniqueName: \"kubernetes.io/projected/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-kube-api-access-r5frg\") pod \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.483452 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-utilities\") pod \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\" (UID: \"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d\") " Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.484286 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-utilities" (OuterVolumeSpecName: "utilities") pod "3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" (UID: "3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.489040 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-kube-api-access-r5frg" (OuterVolumeSpecName: "kube-api-access-r5frg") pod "3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" (UID: "3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d"). InnerVolumeSpecName "kube-api-access-r5frg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.527016 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" (UID: "3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.584731 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.584768 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5frg\" (UniqueName: \"kubernetes.io/projected/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-kube-api-access-r5frg\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.584787 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.911211 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ww29" event={"ID":"3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d","Type":"ContainerDied","Data":"8c4b8f9f4557886462abf51272a21a64451fae94a03f341bb44ee38780eefaa3"} Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.911384 4790 scope.go:117] "RemoveContainer" containerID="2a7aeb9a0d9d77a0cebddb45633127e5061faa37f0aea90346c95faa788f53a1" Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.911280 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7ww29" Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.942964 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7ww29"] Oct 03 08:44:07 crc kubenswrapper[4790]: I1003 08:44:07.945788 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7ww29"] Oct 03 08:44:08 crc kubenswrapper[4790]: I1003 08:44:08.339982 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" path="/var/lib/kubelet/pods/3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d/volumes" Oct 03 08:44:12 crc kubenswrapper[4790]: I1003 08:44:12.433513 4790 scope.go:117] "RemoveContainer" containerID="639caa38f1ceb2de83050259dd0277c8a84ca46c2f1a7f92887c5d6767d734ae" Oct 03 08:44:13 crc kubenswrapper[4790]: I1003 08:44:13.815996 4790 scope.go:117] "RemoveContainer" containerID="60f604cae7e195869e3adbb65ec6078f64ec51ce6f7da3dc96f758a17dfda06a" Oct 03 08:44:15 crc kubenswrapper[4790]: I1003 08:44:15.876196 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:44:15 crc kubenswrapper[4790]: I1003 08:44:15.921736 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kbklb"] Oct 03 08:44:15 crc kubenswrapper[4790]: I1003 08:44:15.973738 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kbklb" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerName="registry-server" containerID="cri-o://9914fa66b7868939f58ecb666f2ef0d53c3a1a56def9d45873b8094d2a392fbe" gracePeriod=2 Oct 03 08:44:15 crc kubenswrapper[4790]: I1003 08:44:15.973825 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9chf" 
event={"ID":"6b044862-0484-4701-a965-796c87a3adc1","Type":"ContainerStarted","Data":"6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9"} Oct 03 08:44:16 crc kubenswrapper[4790]: I1003 08:44:16.981179 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sttgw" event={"ID":"de73f7fd-8767-4ac4-8975-9eb18a03b881","Type":"ContainerStarted","Data":"34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014"} Oct 03 08:44:16 crc kubenswrapper[4790]: I1003 08:44:16.984173 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njqpz" event={"ID":"4882956d-71db-444d-9573-83151dc6f80f","Type":"ContainerStarted","Data":"7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621"} Oct 03 08:44:16 crc kubenswrapper[4790]: I1003 08:44:16.986349 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf5kb" event={"ID":"46b04491-809e-431f-bac5-2a84accab94f","Type":"ContainerStarted","Data":"5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f"} Oct 03 08:44:16 crc kubenswrapper[4790]: I1003 08:44:16.988969 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hf4ws" event={"ID":"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60","Type":"ContainerStarted","Data":"c7b29043f24bc0d763c35185444568c4efd6030e8d7fb7befece14f05aeae6fa"} Oct 03 08:44:17 crc kubenswrapper[4790]: I1003 08:44:17.013942 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sttgw" podStartSLOduration=13.367884884 podStartE2EDuration="1m32.013925254s" podCreationTimestamp="2025-10-03 08:42:45 +0000 UTC" firstStartedPulling="2025-10-03 08:42:46.016866737 +0000 UTC m=+152.369800975" lastFinishedPulling="2025-10-03 08:44:04.662907107 +0000 UTC m=+231.015841345" observedRunningTime="2025-10-03 08:44:17.011889547 +0000 UTC m=+243.364823785" 
watchObservedRunningTime="2025-10-03 08:44:17.013925254 +0000 UTC m=+243.366859482" Oct 03 08:44:17 crc kubenswrapper[4790]: I1003 08:44:17.042479 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g9chf" podStartSLOduration=6.135303504 podStartE2EDuration="1m34.042462805s" podCreationTimestamp="2025-10-03 08:42:43 +0000 UTC" firstStartedPulling="2025-10-03 08:42:44.954377637 +0000 UTC m=+151.307311875" lastFinishedPulling="2025-10-03 08:44:12.861536938 +0000 UTC m=+239.214471176" observedRunningTime="2025-10-03 08:44:17.041833017 +0000 UTC m=+243.394767285" watchObservedRunningTime="2025-10-03 08:44:17.042462805 +0000 UTC m=+243.395397043" Oct 03 08:44:17 crc kubenswrapper[4790]: I1003 08:44:17.062014 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-njqpz" podStartSLOduration=4.706837038 podStartE2EDuration="1m31.06199491s" podCreationTimestamp="2025-10-03 08:42:46 +0000 UTC" firstStartedPulling="2025-10-03 08:42:48.106915724 +0000 UTC m=+154.459849962" lastFinishedPulling="2025-10-03 08:44:14.462073596 +0000 UTC m=+240.815007834" observedRunningTime="2025-10-03 08:44:17.059790547 +0000 UTC m=+243.412724805" watchObservedRunningTime="2025-10-03 08:44:17.06199491 +0000 UTC m=+243.414929148" Oct 03 08:44:17 crc kubenswrapper[4790]: I1003 08:44:17.998547 4790 generic.go:334] "Generic (PLEG): container finished" podID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerID="9914fa66b7868939f58ecb666f2ef0d53c3a1a56def9d45873b8094d2a392fbe" exitCode=0 Oct 03 08:44:17 crc kubenswrapper[4790]: I1003 08:44:17.998729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kbklb" event={"ID":"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e","Type":"ContainerDied","Data":"9914fa66b7868939f58ecb666f2ef0d53c3a1a56def9d45873b8094d2a392fbe"} Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.022965 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hf4ws" podStartSLOduration=6.36907943 podStartE2EDuration="1m32.022927043s" podCreationTimestamp="2025-10-03 08:42:46 +0000 UTC" firstStartedPulling="2025-10-03 08:42:48.151701359 +0000 UTC m=+154.504635597" lastFinishedPulling="2025-10-03 08:44:13.805548972 +0000 UTC m=+240.158483210" observedRunningTime="2025-10-03 08:44:17.08775387 +0000 UTC m=+243.440688108" watchObservedRunningTime="2025-10-03 08:44:18.022927043 +0000 UTC m=+244.375861291" Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.192451 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.219634 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rf5kb" podStartSLOduration=13.980751127 podStartE2EDuration="1m35.219613856s" podCreationTimestamp="2025-10-03 08:42:43 +0000 UTC" firstStartedPulling="2025-10-03 08:42:44.960077376 +0000 UTC m=+151.313011614" lastFinishedPulling="2025-10-03 08:44:06.198940105 +0000 UTC m=+232.551874343" observedRunningTime="2025-10-03 08:44:18.026808553 +0000 UTC m=+244.379742811" watchObservedRunningTime="2025-10-03 08:44:18.219613856 +0000 UTC m=+244.572548104" Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.356862 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-catalog-content\") pod \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.357231 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tks57\" (UniqueName: 
\"kubernetes.io/projected/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-kube-api-access-tks57\") pod \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.357342 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-utilities\") pod \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\" (UID: \"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e\") " Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.357984 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-utilities" (OuterVolumeSpecName: "utilities") pod "fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" (UID: "fcf0cfae-75a1-4e0d-95cc-c62f43e6748e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.358226 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.363787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-kube-api-access-tks57" (OuterVolumeSpecName: "kube-api-access-tks57") pod "fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" (UID: "fcf0cfae-75a1-4e0d-95cc-c62f43e6748e"). InnerVolumeSpecName "kube-api-access-tks57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.375098 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" (UID: "fcf0cfae-75a1-4e0d-95cc-c62f43e6748e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.458995 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:18 crc kubenswrapper[4790]: I1003 08:44:18.459033 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tks57\" (UniqueName: \"kubernetes.io/projected/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e-kube-api-access-tks57\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:19 crc kubenswrapper[4790]: I1003 08:44:19.009082 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kbklb" event={"ID":"fcf0cfae-75a1-4e0d-95cc-c62f43e6748e","Type":"ContainerDied","Data":"1fb0ff6a09218f5c7dd15a9cd28115b859d6aa1f56d07f7eea98dfc56dfb2bc1"} Oct 03 08:44:19 crc kubenswrapper[4790]: I1003 08:44:19.009186 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kbklb" Oct 03 08:44:19 crc kubenswrapper[4790]: I1003 08:44:19.009514 4790 scope.go:117] "RemoveContainer" containerID="9914fa66b7868939f58ecb666f2ef0d53c3a1a56def9d45873b8094d2a392fbe" Oct 03 08:44:19 crc kubenswrapper[4790]: I1003 08:44:19.064770 4790 scope.go:117] "RemoveContainer" containerID="690d03f1203280c73635776c84f11dbfa51e5327ff68cfd63957b2c2d3c0656a" Oct 03 08:44:19 crc kubenswrapper[4790]: I1003 08:44:19.074100 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kbklb"] Oct 03 08:44:19 crc kubenswrapper[4790]: I1003 08:44:19.076725 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kbklb"] Oct 03 08:44:19 crc kubenswrapper[4790]: I1003 08:44:19.112912 4790 scope.go:117] "RemoveContainer" containerID="4b77087d27a5ee62dfe25a350b5062286b7f3afcfb1a02e375563cea79ce74d0" Oct 03 08:44:20 crc kubenswrapper[4790]: I1003 08:44:20.335226 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" path="/var/lib/kubelet/pods/fcf0cfae-75a1-4e0d-95cc-c62f43e6748e/volumes" Oct 03 08:44:23 crc kubenswrapper[4790]: I1003 08:44:23.495012 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:44:23 crc kubenswrapper[4790]: I1003 08:44:23.495455 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:44:23 crc kubenswrapper[4790]: I1003 08:44:23.534421 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:44:23 crc kubenswrapper[4790]: I1003 08:44:23.704750 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rf5kb" Oct 03 08:44:23 crc 
kubenswrapper[4790]: I1003 08:44:23.704963 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rf5kb" Oct 03 08:44:23 crc kubenswrapper[4790]: I1003 08:44:23.752426 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rf5kb" Oct 03 08:44:24 crc kubenswrapper[4790]: I1003 08:44:24.090810 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rf5kb" Oct 03 08:44:24 crc kubenswrapper[4790]: I1003 08:44:24.101246 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:44:25 crc kubenswrapper[4790]: I1003 08:44:25.449612 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:44:25 crc kubenswrapper[4790]: I1003 08:44:25.450090 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:44:25 crc kubenswrapper[4790]: I1003 08:44:25.513230 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.092431 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.121081 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rf5kb"] Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.121568 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rf5kb" podUID="46b04491-809e-431f-bac5-2a84accab94f" containerName="registry-server" 
containerID="cri-o://5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f" gracePeriod=2 Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.425295 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.425873 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.464713 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.480868 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rf5kb" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.594976 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-utilities\") pod \"46b04491-809e-431f-bac5-2a84accab94f\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.595063 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-catalog-content\") pod \"46b04491-809e-431f-bac5-2a84accab94f\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.596438 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-utilities" (OuterVolumeSpecName: "utilities") pod "46b04491-809e-431f-bac5-2a84accab94f" (UID: "46b04491-809e-431f-bac5-2a84accab94f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.598706 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8qfx\" (UniqueName: \"kubernetes.io/projected/46b04491-809e-431f-bac5-2a84accab94f-kube-api-access-b8qfx\") pod \"46b04491-809e-431f-bac5-2a84accab94f\" (UID: \"46b04491-809e-431f-bac5-2a84accab94f\") " Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.599180 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.615139 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b04491-809e-431f-bac5-2a84accab94f-kube-api-access-b8qfx" (OuterVolumeSpecName: "kube-api-access-b8qfx") pod "46b04491-809e-431f-bac5-2a84accab94f" (UID: "46b04491-809e-431f-bac5-2a84accab94f"). InnerVolumeSpecName "kube-api-access-b8qfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.628783 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.628871 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.650900 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46b04491-809e-431f-bac5-2a84accab94f" (UID: "46b04491-809e-431f-bac5-2a84accab94f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.683927 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.703257 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46b04491-809e-431f-bac5-2a84accab94f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:26 crc kubenswrapper[4790]: I1003 08:44:26.703340 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8qfx\" (UniqueName: \"kubernetes.io/projected/46b04491-809e-431f-bac5-2a84accab94f-kube-api-access-b8qfx\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.055748 4790 generic.go:334] "Generic (PLEG): container finished" podID="46b04491-809e-431f-bac5-2a84accab94f" containerID="5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f" exitCode=0 Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.055790 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf5kb" event={"ID":"46b04491-809e-431f-bac5-2a84accab94f","Type":"ContainerDied","Data":"5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f"} Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.055842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rf5kb" event={"ID":"46b04491-809e-431f-bac5-2a84accab94f","Type":"ContainerDied","Data":"660712d86bf8e912df80819c20bd0c1a2587ee3cb72814bffa15c85100cea310"} Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.055840 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rf5kb" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.055860 4790 scope.go:117] "RemoveContainer" containerID="5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.073257 4790 scope.go:117] "RemoveContainer" containerID="9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.086533 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rf5kb"] Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.088612 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rf5kb"] Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.097283 4790 scope.go:117] "RemoveContainer" containerID="12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.099417 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.112321 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.120414 4790 scope.go:117] "RemoveContainer" containerID="5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f" Oct 03 08:44:27 crc kubenswrapper[4790]: E1003 08:44:27.121135 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f\": container with ID starting with 5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f not found: ID does not exist" containerID="5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f" Oct 03 08:44:27 
crc kubenswrapper[4790]: I1003 08:44:27.121168 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f"} err="failed to get container status \"5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f\": rpc error: code = NotFound desc = could not find container \"5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f\": container with ID starting with 5369e41d7a809379bcf784e1beb77108276134793cc48069aac38d6c3c4c366f not found: ID does not exist" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.121193 4790 scope.go:117] "RemoveContainer" containerID="9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e" Oct 03 08:44:27 crc kubenswrapper[4790]: E1003 08:44:27.121542 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e\": container with ID starting with 9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e not found: ID does not exist" containerID="9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.121587 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e"} err="failed to get container status \"9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e\": rpc error: code = NotFound desc = could not find container \"9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e\": container with ID starting with 9cf5f3db78d21abf5ee6baf742d30ff015f68a1f61313e4aba241820f4d81b1e not found: ID does not exist" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.121611 4790 scope.go:117] "RemoveContainer" containerID="12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03" Oct 03 
08:44:27 crc kubenswrapper[4790]: E1003 08:44:27.123312 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03\": container with ID starting with 12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03 not found: ID does not exist" containerID="12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03" Oct 03 08:44:27 crc kubenswrapper[4790]: I1003 08:44:27.123344 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03"} err="failed to get container status \"12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03\": rpc error: code = NotFound desc = could not find container \"12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03\": container with ID starting with 12cca27e04c114ffeb063306d85fa28919cb69563922dd41322c84bfd667de03 not found: ID does not exist" Oct 03 08:44:28 crc kubenswrapper[4790]: I1003 08:44:28.334466 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b04491-809e-431f-bac5-2a84accab94f" path="/var/lib/kubelet/pods/46b04491-809e-431f-bac5-2a84accab94f/volumes" Oct 03 08:44:30 crc kubenswrapper[4790]: I1003 08:44:30.510436 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hf4ws"] Oct 03 08:44:30 crc kubenswrapper[4790]: I1003 08:44:30.511063 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hf4ws" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerName="registry-server" containerID="cri-o://c7b29043f24bc0d763c35185444568c4efd6030e8d7fb7befece14f05aeae6fa" gracePeriod=2 Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.078141 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerID="c7b29043f24bc0d763c35185444568c4efd6030e8d7fb7befece14f05aeae6fa" exitCode=0 Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.078196 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hf4ws" event={"ID":"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60","Type":"ContainerDied","Data":"c7b29043f24bc0d763c35185444568c4efd6030e8d7fb7befece14f05aeae6fa"} Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.460469 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.597019 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc6v2\" (UniqueName: \"kubernetes.io/projected/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-kube-api-access-cc6v2\") pod \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.597158 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-utilities\") pod \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.597184 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-catalog-content\") pod \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\" (UID: \"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60\") " Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.597901 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-utilities" (OuterVolumeSpecName: "utilities") pod 
"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" (UID: "0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.617728 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-kube-api-access-cc6v2" (OuterVolumeSpecName: "kube-api-access-cc6v2") pod "0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" (UID: "0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60"). InnerVolumeSpecName "kube-api-access-cc6v2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.679368 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" (UID: "0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.698698 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.698731 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc6v2\" (UniqueName: \"kubernetes.io/projected/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-kube-api-access-cc6v2\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:31 crc kubenswrapper[4790]: I1003 08:44:31.698743 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:44:32 crc kubenswrapper[4790]: I1003 08:44:32.089336 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hf4ws" event={"ID":"0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60","Type":"ContainerDied","Data":"87448abe1481b3b05e21ce650819a5d35eeca1719026993fad5356e3b08a9287"} Oct 03 08:44:32 crc kubenswrapper[4790]: I1003 08:44:32.089566 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hf4ws" Oct 03 08:44:32 crc kubenswrapper[4790]: I1003 08:44:32.089798 4790 scope.go:117] "RemoveContainer" containerID="c7b29043f24bc0d763c35185444568c4efd6030e8d7fb7befece14f05aeae6fa" Oct 03 08:44:32 crc kubenswrapper[4790]: I1003 08:44:32.113546 4790 scope.go:117] "RemoveContainer" containerID="d0c7c12ee49b4ecc5c663f9444dfa614ff5abe29bc3b169c3b6503919805929c" Oct 03 08:44:32 crc kubenswrapper[4790]: I1003 08:44:32.126164 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hf4ws"] Oct 03 08:44:32 crc kubenswrapper[4790]: I1003 08:44:32.129201 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hf4ws"] Oct 03 08:44:32 crc kubenswrapper[4790]: I1003 08:44:32.152834 4790 scope.go:117] "RemoveContainer" containerID="6ead4248df84ba8360815412f5e280fadb271fc3519ffa3bba1c231a3e9e9102" Oct 03 08:44:32 crc kubenswrapper[4790]: I1003 08:44:32.336280 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" path="/var/lib/kubelet/pods/0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60/volumes" Oct 03 08:44:46 crc kubenswrapper[4790]: I1003 08:44:46.715592 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9zxx"] Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.144933 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd"] Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146165 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b04491-809e-431f-bac5-2a84accab94f" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146185 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b04491-809e-431f-bac5-2a84accab94f" containerName="registry-server" Oct 03 08:45:00 crc 
kubenswrapper[4790]: E1003 08:45:00.146199 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146207 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146218 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146225 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146238 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146245 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146260 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146266 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146275 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca400f1-6a23-4dd7-b518-7dadfadd3809" containerName="pruner" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146280 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca400f1-6a23-4dd7-b518-7dadfadd3809" containerName="pruner" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 
08:45:00.146292 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146298 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146308 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b04491-809e-431f-bac5-2a84accab94f" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146314 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b04491-809e-431f-bac5-2a84accab94f" containerName="extract-content" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146323 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146329 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146339 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146451 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146462 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146469 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 
08:45:00.146477 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cf3999-ed97-44d0-a955-bf33f37c2582" containerName="pruner" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146483 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cf3999-ed97-44d0-a955-bf33f37c2582" containerName="pruner" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146491 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b04491-809e-431f-bac5-2a84accab94f" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146498 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b04491-809e-431f-bac5-2a84accab94f" containerName="extract-utilities" Oct 03 08:45:00 crc kubenswrapper[4790]: E1003 08:45:00.146506 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146512 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146652 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf0cfae-75a1-4e0d-95cc-c62f43e6748e" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146665 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3cf3999-ed97-44d0-a955-bf33f37c2582" containerName="pruner" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146675 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b04491-809e-431f-bac5-2a84accab94f" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146683 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca400f1-6a23-4dd7-b518-7dadfadd3809" containerName="pruner" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146692 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3003b3b3-a1dd-4db2-aaee-8a4b0db7d46d" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.146702 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd66a26-e1dc-4abc-b1bf-8c1dc4f84b60" containerName="registry-server" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.147305 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.150181 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.150297 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.160312 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd"] Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.311113 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52625f75-0999-4e17-a847-d2227effc006-secret-volume\") pod \"collect-profiles-29324685-lhpqd\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.311241 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52625f75-0999-4e17-a847-d2227effc006-config-volume\") pod \"collect-profiles-29324685-lhpqd\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.311360 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fttz\" (UniqueName: \"kubernetes.io/projected/52625f75-0999-4e17-a847-d2227effc006-kube-api-access-5fttz\") pod \"collect-profiles-29324685-lhpqd\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.412755 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52625f75-0999-4e17-a847-d2227effc006-config-volume\") pod \"collect-profiles-29324685-lhpqd\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.412852 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fttz\" (UniqueName: \"kubernetes.io/projected/52625f75-0999-4e17-a847-d2227effc006-kube-api-access-5fttz\") pod \"collect-profiles-29324685-lhpqd\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.412888 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52625f75-0999-4e17-a847-d2227effc006-secret-volume\") pod \"collect-profiles-29324685-lhpqd\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.415454 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/52625f75-0999-4e17-a847-d2227effc006-config-volume\") pod \"collect-profiles-29324685-lhpqd\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.422121 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52625f75-0999-4e17-a847-d2227effc006-secret-volume\") pod \"collect-profiles-29324685-lhpqd\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.432923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fttz\" (UniqueName: \"kubernetes.io/projected/52625f75-0999-4e17-a847-d2227effc006-kube-api-access-5fttz\") pod \"collect-profiles-29324685-lhpqd\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.475690 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:00 crc kubenswrapper[4790]: I1003 08:45:00.697744 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd"] Oct 03 08:45:01 crc kubenswrapper[4790]: I1003 08:45:01.258764 4790 generic.go:334] "Generic (PLEG): container finished" podID="52625f75-0999-4e17-a847-d2227effc006" containerID="ab6c5bc82cfc5d6f91e6562d40f19a146cb0220ddddb4b39730c66593c86b2a4" exitCode=0 Oct 03 08:45:01 crc kubenswrapper[4790]: I1003 08:45:01.258882 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" event={"ID":"52625f75-0999-4e17-a847-d2227effc006","Type":"ContainerDied","Data":"ab6c5bc82cfc5d6f91e6562d40f19a146cb0220ddddb4b39730c66593c86b2a4"} Oct 03 08:45:01 crc kubenswrapper[4790]: I1003 08:45:01.259230 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" event={"ID":"52625f75-0999-4e17-a847-d2227effc006","Type":"ContainerStarted","Data":"69aea4786ca3efe6e034300c3c2e23a85c566f11b9faf419b4fb25bdd931d15c"} Oct 03 08:45:02 crc kubenswrapper[4790]: I1003 08:45:02.496032 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:02 crc kubenswrapper[4790]: I1003 08:45:02.646268 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fttz\" (UniqueName: \"kubernetes.io/projected/52625f75-0999-4e17-a847-d2227effc006-kube-api-access-5fttz\") pod \"52625f75-0999-4e17-a847-d2227effc006\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " Oct 03 08:45:02 crc kubenswrapper[4790]: I1003 08:45:02.646865 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52625f75-0999-4e17-a847-d2227effc006-secret-volume\") pod \"52625f75-0999-4e17-a847-d2227effc006\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " Oct 03 08:45:02 crc kubenswrapper[4790]: I1003 08:45:02.646920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52625f75-0999-4e17-a847-d2227effc006-config-volume\") pod \"52625f75-0999-4e17-a847-d2227effc006\" (UID: \"52625f75-0999-4e17-a847-d2227effc006\") " Oct 03 08:45:02 crc kubenswrapper[4790]: I1003 08:45:02.647978 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52625f75-0999-4e17-a847-d2227effc006-config-volume" (OuterVolumeSpecName: "config-volume") pod "52625f75-0999-4e17-a847-d2227effc006" (UID: "52625f75-0999-4e17-a847-d2227effc006"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:45:02 crc kubenswrapper[4790]: I1003 08:45:02.653704 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52625f75-0999-4e17-a847-d2227effc006-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "52625f75-0999-4e17-a847-d2227effc006" (UID: "52625f75-0999-4e17-a847-d2227effc006"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:02 crc kubenswrapper[4790]: I1003 08:45:02.653874 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52625f75-0999-4e17-a847-d2227effc006-kube-api-access-5fttz" (OuterVolumeSpecName: "kube-api-access-5fttz") pod "52625f75-0999-4e17-a847-d2227effc006" (UID: "52625f75-0999-4e17-a847-d2227effc006"). InnerVolumeSpecName "kube-api-access-5fttz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:45:02 crc kubenswrapper[4790]: I1003 08:45:02.748335 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fttz\" (UniqueName: \"kubernetes.io/projected/52625f75-0999-4e17-a847-d2227effc006-kube-api-access-5fttz\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:02 crc kubenswrapper[4790]: I1003 08:45:02.748396 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52625f75-0999-4e17-a847-d2227effc006-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:02 crc kubenswrapper[4790]: I1003 08:45:02.748410 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52625f75-0999-4e17-a847-d2227effc006-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:03 crc kubenswrapper[4790]: I1003 08:45:03.277562 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" event={"ID":"52625f75-0999-4e17-a847-d2227effc006","Type":"ContainerDied","Data":"69aea4786ca3efe6e034300c3c2e23a85c566f11b9faf419b4fb25bdd931d15c"} Oct 03 08:45:03 crc kubenswrapper[4790]: I1003 08:45:03.277643 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69aea4786ca3efe6e034300c3c2e23a85c566f11b9faf419b4fb25bdd931d15c" Oct 03 08:45:03 crc kubenswrapper[4790]: I1003 08:45:03.277725 4790 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd" Oct 03 08:45:11 crc kubenswrapper[4790]: I1003 08:45:11.757953 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" podUID="a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" containerName="oauth-openshift" containerID="cri-o://88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4" gracePeriod=15 Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.104227 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.140955 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-c494796b-wq8x5"] Oct 03 08:45:12 crc kubenswrapper[4790]: E1003 08:45:12.141239 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" containerName="oauth-openshift" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.141254 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" containerName="oauth-openshift" Oct 03 08:45:12 crc kubenswrapper[4790]: E1003 08:45:12.141271 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52625f75-0999-4e17-a847-d2227effc006" containerName="collect-profiles" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.141277 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="52625f75-0999-4e17-a847-d2227effc006" containerName="collect-profiles" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.141406 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" containerName="oauth-openshift" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.141424 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="52625f75-0999-4e17-a847-d2227effc006" containerName="collect-profiles" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.142045 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.163121 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-c494796b-wq8x5"] Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.195532 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-idp-0-file-data\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.196126 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-error\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.196180 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-ocp-branding-template\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.196230 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-provider-selection\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" 
(UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.196257 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-serving-cert\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.196292 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-policies\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.196315 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-cliconfig\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.196339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-session\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.196364 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-service-ca\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197194 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-login\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197231 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-router-certs\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197257 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-trusted-ca-bundle\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197280 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-dir\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197308 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhmwn\" (UniqueName: \"kubernetes.io/projected/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-kube-api-access-mhmwn\") pod \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\" (UID: \"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b\") " Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197330 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197647 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-template-login\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197802 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-template-error\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197854 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197862 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197885 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.197930 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198005 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/424cb3c9-488b-4f71-967c-c9d0eed4b676-audit-dir\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198076 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198220 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198373 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-session\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198474 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-router-certs\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198584 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97d4t\" (UniqueName: 
\"kubernetes.io/projected/424cb3c9-488b-4f71-967c-c9d0eed4b676-kube-api-access-97d4t\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198620 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-audit-policies\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198646 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-service-ca\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198703 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198718 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198731 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 
08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198743 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.198756 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.203611 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.204358 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.204776 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.204886 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-kube-api-access-mhmwn" (OuterVolumeSpecName: "kube-api-access-mhmwn") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "kube-api-access-mhmwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.205388 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.209979 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.210628 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.211567 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.212245 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" (UID: "a6d4f573-ea4a-4c8b-85da-cd3ce54f732b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300201 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-template-login\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300302 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-template-error\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300357 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/424cb3c9-488b-4f71-967c-c9d0eed4b676-audit-dir\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300389 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300425 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: 
\"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300448 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300470 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-session\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300518 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-router-certs\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300547 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-97d4t\" (UniqueName: \"kubernetes.io/projected/424cb3c9-488b-4f71-967c-c9d0eed4b676-kube-api-access-97d4t\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300586 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-audit-policies\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300612 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-service-ca\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300664 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300682 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300696 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-router-certs\") on node 
\"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300709 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhmwn\" (UniqueName: \"kubernetes.io/projected/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-kube-api-access-mhmwn\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300724 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300739 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300753 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300767 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.300781 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.301874 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/424cb3c9-488b-4f71-967c-c9d0eed4b676-audit-dir\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.301920 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-service-ca\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.302703 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-cliconfig\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.303013 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.303927 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/424cb3c9-488b-4f71-967c-c9d0eed4b676-audit-policies\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 
08:45:12.305954 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-serving-cert\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.306340 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.306437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-template-error\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.306366 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-template-login\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.306731 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.307184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-router-certs\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.308440 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-system-session\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.308556 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/424cb3c9-488b-4f71-967c-c9d0eed4b676-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.321952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97d4t\" (UniqueName: \"kubernetes.io/projected/424cb3c9-488b-4f71-967c-c9d0eed4b676-kube-api-access-97d4t\") pod \"oauth-openshift-c494796b-wq8x5\" (UID: \"424cb3c9-488b-4f71-967c-c9d0eed4b676\") " pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.330529 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" containerID="88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4" exitCode=0 Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.330628 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.335258 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" event={"ID":"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b","Type":"ContainerDied","Data":"88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4"} Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.335299 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r9zxx" event={"ID":"a6d4f573-ea4a-4c8b-85da-cd3ce54f732b","Type":"ContainerDied","Data":"e0ebe644bff0fed5794d6b8d21fa932c48c5c1d17133a5cf89e2ee3f8f5eb6fe"} Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.335320 4790 scope.go:117] "RemoveContainer" containerID="88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.371671 4790 scope.go:117] "RemoveContainer" containerID="88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4" Oct 03 08:45:12 crc kubenswrapper[4790]: E1003 08:45:12.373384 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4\": container with ID starting with 88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4 not found: ID does not exist" containerID="88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.373458 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4"} err="failed to get container status \"88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4\": rpc error: code = NotFound desc = could not find container \"88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4\": container with ID starting with 88098454f7c1829922b9c3fd5e3355181c08a2efab48195c7ebe99ac39fabfc4 not found: ID does not exist" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.376129 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9zxx"] Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.379545 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r9zxx"] Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.469773 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:12 crc kubenswrapper[4790]: I1003 08:45:12.682543 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-c494796b-wq8x5"] Oct 03 08:45:13 crc kubenswrapper[4790]: I1003 08:45:13.341326 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" event={"ID":"424cb3c9-488b-4f71-967c-c9d0eed4b676","Type":"ContainerStarted","Data":"6d04c051a73cd42d46555995fa4ecf49e695ecb7632fbadae83ad5176f4a80dc"} Oct 03 08:45:13 crc kubenswrapper[4790]: I1003 08:45:13.341847 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:13 crc kubenswrapper[4790]: I1003 08:45:13.341862 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" 
event={"ID":"424cb3c9-488b-4f71-967c-c9d0eed4b676","Type":"ContainerStarted","Data":"41398c3c125b3e17584094dec1cafb0520fc908af1a821eb600b4c6ffc7dc25c"} Oct 03 08:45:13 crc kubenswrapper[4790]: I1003 08:45:13.370383 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" podStartSLOduration=27.370348952 podStartE2EDuration="27.370348952s" podCreationTimestamp="2025-10-03 08:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:45:13.368386096 +0000 UTC m=+299.721320344" watchObservedRunningTime="2025-10-03 08:45:13.370348952 +0000 UTC m=+299.723283190" Oct 03 08:45:13 crc kubenswrapper[4790]: I1003 08:45:13.613444 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-c494796b-wq8x5" Oct 03 08:45:14 crc kubenswrapper[4790]: I1003 08:45:14.339611 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d4f573-ea4a-4c8b-85da-cd3ce54f732b" path="/var/lib/kubelet/pods/a6d4f573-ea4a-4c8b-85da-cd3ce54f732b/volumes" Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.646063 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g9chf"] Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.646962 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g9chf" podUID="6b044862-0484-4701-a965-796c87a3adc1" containerName="registry-server" containerID="cri-o://6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9" gracePeriod=30 Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.672645 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zj7jl"] Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.673311 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zj7jl" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerName="registry-server" containerID="cri-o://12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a" gracePeriod=30 Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.677068 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h7tdr"] Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.677315 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" podUID="a04ec950-87ce-4f04-9c9d-201a4f6099a1" containerName="marketplace-operator" containerID="cri-o://0a70a9044adf281ff7b64c63a15d94aefe74174c8b191ba0fc8a2cbb6c6e67a8" gracePeriod=30 Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.686166 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sttgw"] Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.686388 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sttgw" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" containerName="registry-server" containerID="cri-o://34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014" gracePeriod=30 Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.693988 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-njqpz"] Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.694237 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-njqpz" podUID="4882956d-71db-444d-9573-83151dc6f80f" containerName="registry-server" containerID="cri-o://7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621" gracePeriod=30 Oct 03 08:45:27 crc kubenswrapper[4790]: 
I1003 08:45:27.707394 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5sbms"] Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.708086 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.731659 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5sbms"] Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.827730 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dc166124-1b08-4c1e-90fe-a35d6331fcf4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5sbms\" (UID: \"dc166124-1b08-4c1e-90fe-a35d6331fcf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.827797 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5t9b\" (UniqueName: \"kubernetes.io/projected/dc166124-1b08-4c1e-90fe-a35d6331fcf4-kube-api-access-m5t9b\") pod \"marketplace-operator-79b997595-5sbms\" (UID: \"dc166124-1b08-4c1e-90fe-a35d6331fcf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.827840 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc166124-1b08-4c1e-90fe-a35d6331fcf4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5sbms\" (UID: \"dc166124-1b08-4c1e-90fe-a35d6331fcf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.929242 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc166124-1b08-4c1e-90fe-a35d6331fcf4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5sbms\" (UID: \"dc166124-1b08-4c1e-90fe-a35d6331fcf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.929321 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dc166124-1b08-4c1e-90fe-a35d6331fcf4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5sbms\" (UID: \"dc166124-1b08-4c1e-90fe-a35d6331fcf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.929432 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5t9b\" (UniqueName: \"kubernetes.io/projected/dc166124-1b08-4c1e-90fe-a35d6331fcf4-kube-api-access-m5t9b\") pod \"marketplace-operator-79b997595-5sbms\" (UID: \"dc166124-1b08-4c1e-90fe-a35d6331fcf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.933009 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc166124-1b08-4c1e-90fe-a35d6331fcf4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5sbms\" (UID: \"dc166124-1b08-4c1e-90fe-a35d6331fcf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.938677 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/dc166124-1b08-4c1e-90fe-a35d6331fcf4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5sbms\" (UID: 
\"dc166124-1b08-4c1e-90fe-a35d6331fcf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:27 crc kubenswrapper[4790]: I1003 08:45:27.956695 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5t9b\" (UniqueName: \"kubernetes.io/projected/dc166124-1b08-4c1e-90fe-a35d6331fcf4-kube-api-access-m5t9b\") pod \"marketplace-operator-79b997595-5sbms\" (UID: \"dc166124-1b08-4c1e-90fe-a35d6331fcf4\") " pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.032794 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.135655 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.146861 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.149959 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.182973 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.232607 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b485\" (UniqueName: \"kubernetes.io/projected/4882956d-71db-444d-9573-83151dc6f80f-kube-api-access-2b485\") pod \"4882956d-71db-444d-9573-83151dc6f80f\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.232668 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-utilities\") pod \"4882956d-71db-444d-9573-83151dc6f80f\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.232714 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-catalog-content\") pod \"b66141f0-fc3d-4596-9484-d01137e1f76c\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.232827 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-utilities\") pod \"de73f7fd-8767-4ac4-8975-9eb18a03b881\" (UID: \"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.232852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-utilities\") pod \"6b044862-0484-4701-a965-796c87a3adc1\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.232886 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-catalog-content\") pod \"6b044862-0484-4701-a965-796c87a3adc1\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.232907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-utilities\") pod \"b66141f0-fc3d-4596-9484-d01137e1f76c\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.232935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdtwz\" (UniqueName: \"kubernetes.io/projected/b66141f0-fc3d-4596-9484-d01137e1f76c-kube-api-access-bdtwz\") pod \"b66141f0-fc3d-4596-9484-d01137e1f76c\" (UID: \"b66141f0-fc3d-4596-9484-d01137e1f76c\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.233007 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-catalog-content\") pod \"de73f7fd-8767-4ac4-8975-9eb18a03b881\" (UID: \"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.233034 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6w6k\" (UniqueName: \"kubernetes.io/projected/6b044862-0484-4701-a965-796c87a3adc1-kube-api-access-t6w6k\") pod \"6b044862-0484-4701-a965-796c87a3adc1\" (UID: \"6b044862-0484-4701-a965-796c87a3adc1\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.233053 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrqzl\" (UniqueName: \"kubernetes.io/projected/de73f7fd-8767-4ac4-8975-9eb18a03b881-kube-api-access-vrqzl\") pod \"de73f7fd-8767-4ac4-8975-9eb18a03b881\" (UID: \"de73f7fd-8767-4ac4-8975-9eb18a03b881\") " Oct 
03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.233078 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-catalog-content\") pod \"4882956d-71db-444d-9573-83151dc6f80f\" (UID: \"4882956d-71db-444d-9573-83151dc6f80f\") " Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.234792 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-utilities" (OuterVolumeSpecName: "utilities") pod "6b044862-0484-4701-a965-796c87a3adc1" (UID: "6b044862-0484-4701-a965-796c87a3adc1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.234821 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-utilities" (OuterVolumeSpecName: "utilities") pod "4882956d-71db-444d-9573-83151dc6f80f" (UID: "4882956d-71db-444d-9573-83151dc6f80f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.234870 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-utilities" (OuterVolumeSpecName: "utilities") pod "de73f7fd-8767-4ac4-8975-9eb18a03b881" (UID: "de73f7fd-8767-4ac4-8975-9eb18a03b881"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.234952 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-utilities" (OuterVolumeSpecName: "utilities") pod "b66141f0-fc3d-4596-9484-d01137e1f76c" (UID: "b66141f0-fc3d-4596-9484-d01137e1f76c"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.236616 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4882956d-71db-444d-9573-83151dc6f80f-kube-api-access-2b485" (OuterVolumeSpecName: "kube-api-access-2b485") pod "4882956d-71db-444d-9573-83151dc6f80f" (UID: "4882956d-71db-444d-9573-83151dc6f80f"). InnerVolumeSpecName "kube-api-access-2b485". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.236934 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b66141f0-fc3d-4596-9484-d01137e1f76c-kube-api-access-bdtwz" (OuterVolumeSpecName: "kube-api-access-bdtwz") pod "b66141f0-fc3d-4596-9484-d01137e1f76c" (UID: "b66141f0-fc3d-4596-9484-d01137e1f76c"). InnerVolumeSpecName "kube-api-access-bdtwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.241788 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de73f7fd-8767-4ac4-8975-9eb18a03b881-kube-api-access-vrqzl" (OuterVolumeSpecName: "kube-api-access-vrqzl") pod "de73f7fd-8767-4ac4-8975-9eb18a03b881" (UID: "de73f7fd-8767-4ac4-8975-9eb18a03b881"). InnerVolumeSpecName "kube-api-access-vrqzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.251925 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de73f7fd-8767-4ac4-8975-9eb18a03b881" (UID: "de73f7fd-8767-4ac4-8975-9eb18a03b881"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.254862 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b044862-0484-4701-a965-796c87a3adc1-kube-api-access-t6w6k" (OuterVolumeSpecName: "kube-api-access-t6w6k") pod "6b044862-0484-4701-a965-796c87a3adc1" (UID: "6b044862-0484-4701-a965-796c87a3adc1"). InnerVolumeSpecName "kube-api-access-t6w6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.287759 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b044862-0484-4701-a965-796c87a3adc1" (UID: "6b044862-0484-4701-a965-796c87a3adc1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.293284 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b66141f0-fc3d-4596-9484-d01137e1f76c" (UID: "b66141f0-fc3d-4596-9484-d01137e1f76c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.333774 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4882956d-71db-444d-9573-83151dc6f80f" (UID: "4882956d-71db-444d-9573-83151dc6f80f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334279 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334304 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334314 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334323 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b044862-0484-4701-a965-796c87a3adc1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334460 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdtwz\" (UniqueName: \"kubernetes.io/projected/b66141f0-fc3d-4596-9484-d01137e1f76c-kube-api-access-bdtwz\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334622 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de73f7fd-8767-4ac4-8975-9eb18a03b881-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334685 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6w6k\" (UniqueName: \"kubernetes.io/projected/6b044862-0484-4701-a965-796c87a3adc1-kube-api-access-t6w6k\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334700 
4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrqzl\" (UniqueName: \"kubernetes.io/projected/de73f7fd-8767-4ac4-8975-9eb18a03b881-kube-api-access-vrqzl\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334710 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334720 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b485\" (UniqueName: \"kubernetes.io/projected/4882956d-71db-444d-9573-83151dc6f80f-kube-api-access-2b485\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334730 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4882956d-71db-444d-9573-83151dc6f80f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.334739 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b66141f0-fc3d-4596-9484-d01137e1f76c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.420736 4790 generic.go:334] "Generic (PLEG): container finished" podID="a04ec950-87ce-4f04-9c9d-201a4f6099a1" containerID="0a70a9044adf281ff7b64c63a15d94aefe74174c8b191ba0fc8a2cbb6c6e67a8" exitCode=0 Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.420811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" event={"ID":"a04ec950-87ce-4f04-9c9d-201a4f6099a1","Type":"ContainerDied","Data":"0a70a9044adf281ff7b64c63a15d94aefe74174c8b191ba0fc8a2cbb6c6e67a8"} Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.423348 4790 generic.go:334] "Generic (PLEG): container 
finished" podID="6b044862-0484-4701-a965-796c87a3adc1" containerID="6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9" exitCode=0 Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.423411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9chf" event={"ID":"6b044862-0484-4701-a965-796c87a3adc1","Type":"ContainerDied","Data":"6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9"} Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.423433 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g9chf" event={"ID":"6b044862-0484-4701-a965-796c87a3adc1","Type":"ContainerDied","Data":"ae841a1c47fc6b6929936a4b8de238798847375c72009d576550606b5a6c4d5a"} Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.423453 4790 scope.go:117] "RemoveContainer" containerID="6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.423414 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g9chf" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.425747 4790 generic.go:334] "Generic (PLEG): container finished" podID="de73f7fd-8767-4ac4-8975-9eb18a03b881" containerID="34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014" exitCode=0 Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.425793 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sttgw" event={"ID":"de73f7fd-8767-4ac4-8975-9eb18a03b881","Type":"ContainerDied","Data":"34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014"} Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.425813 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sttgw" event={"ID":"de73f7fd-8767-4ac4-8975-9eb18a03b881","Type":"ContainerDied","Data":"1ba3363a7c04846776f042309df4aa2352784325f68d70ea8fd7ece9fc2d41f2"} Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.425867 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sttgw" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.430075 4790 generic.go:334] "Generic (PLEG): container finished" podID="4882956d-71db-444d-9573-83151dc6f80f" containerID="7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621" exitCode=0 Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.430203 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-njqpz" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.430309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njqpz" event={"ID":"4882956d-71db-444d-9573-83151dc6f80f","Type":"ContainerDied","Data":"7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621"} Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.430434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njqpz" event={"ID":"4882956d-71db-444d-9573-83151dc6f80f","Type":"ContainerDied","Data":"7d5ac4fac659d76d7b1b9352a79b6eb40b0c8666b7614e205a39fe34eb7c3f4d"} Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.435164 4790 generic.go:334] "Generic (PLEG): container finished" podID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerID="12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a" exitCode=0 Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.435211 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj7jl" event={"ID":"b66141f0-fc3d-4596-9484-d01137e1f76c","Type":"ContainerDied","Data":"12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a"} Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.435222 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zj7jl" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.435241 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zj7jl" event={"ID":"b66141f0-fc3d-4596-9484-d01137e1f76c","Type":"ContainerDied","Data":"af5f332bd6f8ba73c91b2fab8740d3da9902a418a6b8e11f089f6525fa28536d"} Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.454701 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sttgw"] Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.454839 4790 scope.go:117] "RemoveContainer" containerID="24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.458619 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sttgw"] Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.474772 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g9chf"] Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.480164 4790 scope.go:117] "RemoveContainer" containerID="93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.484609 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g9chf"] Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.488086 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5sbms"] Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.491251 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zj7jl"] Oct 03 08:45:28 crc kubenswrapper[4790]: W1003 08:45:28.493957 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc166124_1b08_4c1e_90fe_a35d6331fcf4.slice/crio-216628e2fcdc1f98d977bffb181ca3980fe515d0e57edef54b2a951af6a6f0a1 WatchSource:0}: Error finding container 216628e2fcdc1f98d977bffb181ca3980fe515d0e57edef54b2a951af6a6f0a1: Status 404 returned error can't find the container with id 216628e2fcdc1f98d977bffb181ca3980fe515d0e57edef54b2a951af6a6f0a1 Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.496319 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zj7jl"] Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.498185 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-njqpz"] Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.500614 4790 scope.go:117] "RemoveContainer" containerID="6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.501054 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-njqpz"] Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.501140 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9\": container with ID starting with 6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9 not found: ID does not exist" containerID="6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9" Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.501214 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9"} err="failed to get container status \"6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9\": rpc error: code = NotFound desc = could not find container 
\"6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9\": container with ID starting with 6022d3806f2192f5e6ec39991c18c426252ea4fb82e052f02eed52662c0885d9 not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.501251 4790 scope.go:117] "RemoveContainer" containerID="24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.501758 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca\": container with ID starting with 24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca not found: ID does not exist" containerID="24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.501786 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca"} err="failed to get container status \"24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca\": rpc error: code = NotFound desc = could not find container \"24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca\": container with ID starting with 24140849270be8938403c7e2730646843e2ba15ade9747b98c5cf8b0969b72ca not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.501805 4790 scope.go:117] "RemoveContainer" containerID="93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.502231 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d\": container with ID starting with 93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d not found: ID does not exist" containerID="93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.502270 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d"} err="failed to get container status \"93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d\": rpc error: code = NotFound desc = could not find container \"93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d\": container with ID starting with 93ce6f94129b1a31d4e944d62f4de87a09c5ebc34e114abf8f90f72a1c9aa16d not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.502305 4790 scope.go:117] "RemoveContainer" containerID="34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.516424 4790 scope.go:117] "RemoveContainer" containerID="3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.549659 4790 scope.go:117] "RemoveContainer" containerID="d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.569157 4790 scope.go:117] "RemoveContainer" containerID="34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.569635 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014\": container with ID starting with 34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014 not found: ID does not exist" containerID="34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.569700 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014"} err="failed to get container status \"34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014\": rpc error: code = NotFound desc = could not find container \"34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014\": container with ID starting with 34d15504da66bdd752c1892d9faff81461cfed9a95b3f9e8cb5c059411aa1014 not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.569737 4790 scope.go:117] "RemoveContainer" containerID="3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.570070 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae\": container with ID starting with 3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae not found: ID does not exist" containerID="3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.570107 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae"} err="failed to get container status \"3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae\": rpc error: code = NotFound desc = could not find container \"3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae\": container with ID starting with 3b6a79be929bc70d84cd1facc64a8b15f799abf3812164ad34a102ca0f3134ae not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.570133 4790 scope.go:117] "RemoveContainer" containerID="d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.570395 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6\": container with ID starting with d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6 not found: ID does not exist" containerID="d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.570428 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6"} err="failed to get container status \"d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6\": rpc error: code = NotFound desc = could not find container \"d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6\": container with ID starting with d5afb3910230680bc2f2461fbb476801408abf43495bca0cfb330a04d6d34eb6 not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.570448 4790 scope.go:117] "RemoveContainer" containerID="7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.588777 4790 scope.go:117] "RemoveContainer" containerID="06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.623252 4790 scope.go:117] "RemoveContainer" containerID="f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.637846 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.644233 4790 scope.go:117] "RemoveContainer" containerID="7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.644793 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621\": container with ID starting with 7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621 not found: ID does not exist" containerID="7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.644836 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621"} err="failed to get container status \"7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621\": rpc error: code = NotFound desc = could not find container \"7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621\": container with ID starting with 7132b4101d980da555344150482a7deafd8272dc23c4a7ed9360fdf717e66621 not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.644882 4790 scope.go:117] "RemoveContainer" containerID="06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.645265 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143\": container with ID starting with 06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143 not found: ID does not exist" containerID="06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.645365 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143"} err="failed to get container status \"06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143\": rpc error: code = NotFound desc = could not find container \"06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143\": container with ID starting with 06df50519bd50c832b25e7eda9d7116fd1f9d8e905fb8c3a607fdc4b734a9143 not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.645398 4790 scope.go:117] "RemoveContainer" containerID="f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.645749 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4\": container with ID starting with f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4 not found: ID does not exist" containerID="f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.645800 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4"} err="failed to get container status \"f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4\": rpc error: code = NotFound desc = could not find container \"f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4\": container with ID starting with f39196339522d3de25733de5e7e0f2405478d69a9af5fffadc44b19bcd036be4 not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.645818 4790 scope.go:117] "RemoveContainer" containerID="12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.669839 4790 scope.go:117] "RemoveContainer" containerID="a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.692094 4790 scope.go:117] "RemoveContainer" containerID="6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.713715 4790 scope.go:117] "RemoveContainer" containerID="12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.714221 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a\": container with ID starting with 12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a not found: ID does not exist" containerID="12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.715835 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a"} err="failed to get container status \"12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a\": rpc error: code = NotFound desc = could not find container \"12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a\": container with ID starting with 12a322c107c67611ae45e6459662e2d1b17284183a82e4934aeb2943ec617d9a not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.715877 4790 scope.go:117] "RemoveContainer" containerID="a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.716748 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba\": container with ID starting with a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba not found: ID does not exist" containerID="a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.716784 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba"} err="failed to get container status \"a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba\": rpc error: code = NotFound desc = could not find container \"a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba\": container with ID starting with a5e26cd7d7a34a16853f6c5645972850a920d7b13bb478f7380111c7c241bbba not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.716811 4790 scope.go:117] "RemoveContainer" containerID="6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086"
Oct 03 08:45:28 crc kubenswrapper[4790]: E1003 08:45:28.717145 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086\": container with ID starting with 6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086 not found: ID does not exist" containerID="6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.717178 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086"} err="failed to get container status \"6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086\": rpc error: code = NotFound desc = could not find container \"6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086\": container with ID starting with 6f54614a7213f63ecaf6d9243af5412285bcbd46c6d78db7c1e8f4f9e053a086 not found: ID does not exist"
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.739562 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6ngr\" (UniqueName: \"kubernetes.io/projected/a04ec950-87ce-4f04-9c9d-201a4f6099a1-kube-api-access-z6ngr\") pod \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") "
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.739653 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-operator-metrics\") pod \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") "
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.739711 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-trusted-ca\") pod \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\" (UID: \"a04ec950-87ce-4f04-9c9d-201a4f6099a1\") "
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.740377 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a04ec950-87ce-4f04-9c9d-201a4f6099a1" (UID: "a04ec950-87ce-4f04-9c9d-201a4f6099a1"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.744418 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a04ec950-87ce-4f04-9c9d-201a4f6099a1" (UID: "a04ec950-87ce-4f04-9c9d-201a4f6099a1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.744523 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a04ec950-87ce-4f04-9c9d-201a4f6099a1-kube-api-access-z6ngr" (OuterVolumeSpecName: "kube-api-access-z6ngr") pod "a04ec950-87ce-4f04-9c9d-201a4f6099a1" (UID: "a04ec950-87ce-4f04-9c9d-201a4f6099a1"). InnerVolumeSpecName "kube-api-access-z6ngr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.841052 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6ngr\" (UniqueName: \"kubernetes.io/projected/a04ec950-87ce-4f04-9c9d-201a4f6099a1-kube-api-access-z6ngr\") on node \"crc\" DevicePath \"\""
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.841093 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Oct 03 08:45:28 crc kubenswrapper[4790]: I1003 08:45:28.841103 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a04ec950-87ce-4f04-9c9d-201a4f6099a1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.443636 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.443630 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-h7tdr" event={"ID":"a04ec950-87ce-4f04-9c9d-201a4f6099a1","Type":"ContainerDied","Data":"0ed1e9b90bbf329b5d60466677ba8dcd121b8b3a8d8653d055c6bada0a7579e9"}
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.443774 4790 scope.go:117] "RemoveContainer" containerID="0a70a9044adf281ff7b64c63a15d94aefe74174c8b191ba0fc8a2cbb6c6e67a8"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.446760 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" event={"ID":"dc166124-1b08-4c1e-90fe-a35d6331fcf4","Type":"ContainerStarted","Data":"24f7c99d8603bff9358810bd4eb8aedd0d8f1928e50a05fc5b63228794b20e30"}
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.446799 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" event={"ID":"dc166124-1b08-4c1e-90fe-a35d6331fcf4","Type":"ContainerStarted","Data":"216628e2fcdc1f98d977bffb181ca3980fe515d0e57edef54b2a951af6a6f0a1"}
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.465329 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5sbms" podStartSLOduration=2.465304651 podStartE2EDuration="2.465304651s" podCreationTimestamp="2025-10-03 08:45:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:45:29.463927903 +0000 UTC m=+315.816862161" watchObservedRunningTime="2025-10-03 08:45:29.465304651 +0000 UTC m=+315.818238889"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.480949 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h7tdr"]
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.485779 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-h7tdr"]
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.866392 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jshwg"]
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.866780 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.866806 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.866826 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4882956d-71db-444d-9573-83151dc6f80f" containerName="extract-content"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.866839 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4882956d-71db-444d-9573-83151dc6f80f" containerName="extract-content"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.866856 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerName="extract-content"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.866870 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerName="extract-content"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.866889 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.866901 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.866917 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b044862-0484-4701-a965-796c87a3adc1" containerName="extract-content"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.866929 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b044862-0484-4701-a965-796c87a3adc1" containerName="extract-content"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.866949 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" containerName="extract-content"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.866961 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" containerName="extract-content"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.866975 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b044862-0484-4701-a965-796c87a3adc1" containerName="extract-utilities"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.866986 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b044862-0484-4701-a965-796c87a3adc1" containerName="extract-utilities"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.867005 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerName="extract-utilities"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867017 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerName="extract-utilities"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.867031 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4882956d-71db-444d-9573-83151dc6f80f" containerName="extract-utilities"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867043 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4882956d-71db-444d-9573-83151dc6f80f" containerName="extract-utilities"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.867060 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" containerName="extract-utilities"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867071 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" containerName="extract-utilities"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.867086 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4882956d-71db-444d-9573-83151dc6f80f" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867098 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4882956d-71db-444d-9573-83151dc6f80f" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.867113 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a04ec950-87ce-4f04-9c9d-201a4f6099a1" containerName="marketplace-operator"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867125 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04ec950-87ce-4f04-9c9d-201a4f6099a1" containerName="marketplace-operator"
Oct 03 08:45:29 crc kubenswrapper[4790]: E1003 08:45:29.867144 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b044862-0484-4701-a965-796c87a3adc1" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867155 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b044862-0484-4701-a965-796c87a3adc1" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867296 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867326 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a04ec950-87ce-4f04-9c9d-201a4f6099a1" containerName="marketplace-operator"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867343 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867359 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b044862-0484-4701-a965-796c87a3adc1" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.867378 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4882956d-71db-444d-9573-83151dc6f80f" containerName="registry-server"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.868499 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.873019 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.874304 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jshwg"]
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.953677 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h827s\" (UniqueName: \"kubernetes.io/projected/cd3a5dd4-70b9-462e-994b-18d4fbf399c3-kube-api-access-h827s\") pod \"redhat-marketplace-jshwg\" (UID: \"cd3a5dd4-70b9-462e-994b-18d4fbf399c3\") " pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.953733 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd3a5dd4-70b9-462e-994b-18d4fbf399c3-catalog-content\") pod \"redhat-marketplace-jshwg\" (UID: \"cd3a5dd4-70b9-462e-994b-18d4fbf399c3\") " pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:29 crc kubenswrapper[4790]: I1003 08:45:29.953766 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd3a5dd4-70b9-462e-994b-18d4fbf399c3-utilities\") pod \"redhat-marketplace-jshwg\" (UID: \"cd3a5dd4-70b9-462e-994b-18d4fbf399c3\") " pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.056872 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h827s\" (UniqueName: \"kubernetes.io/projected/cd3a5dd4-70b9-462e-994b-18d4fbf399c3-kube-api-access-h827s\") pod \"redhat-marketplace-jshwg\" (UID: \"cd3a5dd4-70b9-462e-994b-18d4fbf399c3\") " pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.056959 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd3a5dd4-70b9-462e-994b-18d4fbf399c3-catalog-content\") pod \"redhat-marketplace-jshwg\" (UID: \"cd3a5dd4-70b9-462e-994b-18d4fbf399c3\") " pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.057009 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd3a5dd4-70b9-462e-994b-18d4fbf399c3-utilities\") pod \"redhat-marketplace-jshwg\" (UID: \"cd3a5dd4-70b9-462e-994b-18d4fbf399c3\") " pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.057712 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd3a5dd4-70b9-462e-994b-18d4fbf399c3-utilities\") pod \"redhat-marketplace-jshwg\" (UID: \"cd3a5dd4-70b9-462e-994b-18d4fbf399c3\") " pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.058082 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd3a5dd4-70b9-462e-994b-18d4fbf399c3-catalog-content\") pod \"redhat-marketplace-jshwg\" (UID: \"cd3a5dd4-70b9-462e-994b-18d4fbf399c3\") " pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.066570 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5z6kh"]
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.067877 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.071168 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.074387 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5z6kh"]
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.092771 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h827s\" (UniqueName: \"kubernetes.io/projected/cd3a5dd4-70b9-462e-994b-18d4fbf399c3-kube-api-access-h827s\") pod \"redhat-marketplace-jshwg\" (UID: \"cd3a5dd4-70b9-462e-994b-18d4fbf399c3\") " pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.158687 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61616e67-5189-42b0-a4bc-8f6fc803f2a7-utilities\") pod \"redhat-operators-5z6kh\" (UID: \"61616e67-5189-42b0-a4bc-8f6fc803f2a7\") " pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.158786 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6srd\" (UniqueName: \"kubernetes.io/projected/61616e67-5189-42b0-a4bc-8f6fc803f2a7-kube-api-access-z6srd\") pod \"redhat-operators-5z6kh\" (UID: \"61616e67-5189-42b0-a4bc-8f6fc803f2a7\") " pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.158848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61616e67-5189-42b0-a4bc-8f6fc803f2a7-catalog-content\") pod \"redhat-operators-5z6kh\" (UID: \"61616e67-5189-42b0-a4bc-8f6fc803f2a7\") " pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.185433 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jshwg"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.266929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61616e67-5189-42b0-a4bc-8f6fc803f2a7-utilities\") pod \"redhat-operators-5z6kh\" (UID: \"61616e67-5189-42b0-a4bc-8f6fc803f2a7\") " pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.268194 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6srd\" (UniqueName: \"kubernetes.io/projected/61616e67-5189-42b0-a4bc-8f6fc803f2a7-kube-api-access-z6srd\") pod \"redhat-operators-5z6kh\" (UID: \"61616e67-5189-42b0-a4bc-8f6fc803f2a7\") " pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.268290 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61616e67-5189-42b0-a4bc-8f6fc803f2a7-catalog-content\") pod \"redhat-operators-5z6kh\" (UID: \"61616e67-5189-42b0-a4bc-8f6fc803f2a7\") " pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.269414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61616e67-5189-42b0-a4bc-8f6fc803f2a7-utilities\") pod \"redhat-operators-5z6kh\" (UID: \"61616e67-5189-42b0-a4bc-8f6fc803f2a7\") " pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.269633 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61616e67-5189-42b0-a4bc-8f6fc803f2a7-catalog-content\") pod \"redhat-operators-5z6kh\" (UID: \"61616e67-5189-42b0-a4bc-8f6fc803f2a7\") " pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.301135 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6srd\" (UniqueName: \"kubernetes.io/projected/61616e67-5189-42b0-a4bc-8f6fc803f2a7-kube-api-access-z6srd\") pod \"redhat-operators-5z6kh\" (UID: \"61616e67-5189-42b0-a4bc-8f6fc803f2a7\") " pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.338147 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4882956d-71db-444d-9573-83151dc6f80f" path="/var/lib/kubelet/pods/4882956d-71db-444d-9573-83151dc6f80f/volumes"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.338783 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b044862-0484-4701-a965-796c87a3adc1" path="/var/lib/kubelet/pods/6b044862-0484-4701-a965-796c87a3adc1/volumes"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.339389 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a04ec950-87ce-4f04-9c9d-201a4f6099a1" path="/var/lib/kubelet/pods/a04ec950-87ce-4f04-9c9d-201a4f6099a1/volumes"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.340333 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b66141f0-fc3d-4596-9484-d01137e1f76c" path="/var/lib/kubelet/pods/b66141f0-fc3d-4596-9484-d01137e1f76c/volumes"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.341341 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de73f7fd-8767-4ac4-8975-9eb18a03b881" path="/var/lib/kubelet/pods/de73f7fd-8767-4ac4-8975-9eb18a03b881/volumes"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.365283 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jshwg"]
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.406298 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5z6kh"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.454423 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jshwg" event={"ID":"cd3a5dd4-70b9-462e-994b-18d4fbf399c3","Type":"ContainerStarted","Data":"8b5bdaf92e6d98571734976d3f5c7e91ed6d9c6a0a8a5091d0da299eeb1229fa"}
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.455740 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5sbms"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.460667 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5sbms"
Oct 03 08:45:30 crc kubenswrapper[4790]: I1003 08:45:30.585075 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5z6kh"]
Oct 03 08:45:30 crc kubenswrapper[4790]: W1003 08:45:30.590067 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61616e67_5189_42b0_a4bc_8f6fc803f2a7.slice/crio-afc821655ccb71d8abd4232427955ff6a96caa08ab10776d77ccaa3b0559d257 WatchSource:0}: Error finding container afc821655ccb71d8abd4232427955ff6a96caa08ab10776d77ccaa3b0559d257: Status 404 returned error can't find the container with id afc821655ccb71d8abd4232427955ff6a96caa08ab10776d77ccaa3b0559d257
Oct 03 08:45:31 crc kubenswrapper[4790]: I1003 08:45:31.463288 4790 generic.go:334] "Generic (PLEG): container finished" podID="61616e67-5189-42b0-a4bc-8f6fc803f2a7" containerID="bf00f6387e7f4ee8efecbcd602c2f33d44dae208b03b5af5a11293c6e3f7106b" exitCode=0
Oct 03 08:45:31 crc kubenswrapper[4790]: I1003 08:45:31.463387 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z6kh" event={"ID":"61616e67-5189-42b0-a4bc-8f6fc803f2a7","Type":"ContainerDied","Data":"bf00f6387e7f4ee8efecbcd602c2f33d44dae208b03b5af5a11293c6e3f7106b"}
Oct 03 08:45:31 crc kubenswrapper[4790]: I1003 08:45:31.463832 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z6kh" event={"ID":"61616e67-5189-42b0-a4bc-8f6fc803f2a7","Type":"ContainerStarted","Data":"afc821655ccb71d8abd4232427955ff6a96caa08ab10776d77ccaa3b0559d257"}
Oct 03 08:45:31 crc kubenswrapper[4790]: I1003 08:45:31.465430 4790 generic.go:334] "Generic (PLEG): container finished" podID="cd3a5dd4-70b9-462e-994b-18d4fbf399c3" containerID="caabe4d3e06b0d756b8cf52a0c1c2304a0b5a336c731f02e11ee0b8614d0837f" exitCode=0
Oct 03 08:45:31 crc kubenswrapper[4790]: I1003 08:45:31.465698 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jshwg" event={"ID":"cd3a5dd4-70b9-462e-994b-18d4fbf399c3","Type":"ContainerDied","Data":"caabe4d3e06b0d756b8cf52a0c1c2304a0b5a336c731f02e11ee0b8614d0837f"}
Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.275419 4790 kubelet.go:2421] "SyncLoop ADD"
source="api" pods=["openshift-marketplace/community-operators-8rbb4"] Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.276835 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.279105 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.283248 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rbb4"] Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.416392 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-catalog-content\") pod \"community-operators-8rbb4\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.416658 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-utilities\") pod \"community-operators-8rbb4\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.416903 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn6ns\" (UniqueName: \"kubernetes.io/projected/668fc837-03c4-4539-a9c6-dc5aca43693a-kube-api-access-gn6ns\") pod \"community-operators-8rbb4\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.470258 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-zkj66"] Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.471307 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.474300 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.477804 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z6kh" event={"ID":"61616e67-5189-42b0-a4bc-8f6fc803f2a7","Type":"ContainerStarted","Data":"859560a317d03405f9077c79a5b8cc409dbb9187e722c563195d8bb689e04a98"} Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.481338 4790 generic.go:334] "Generic (PLEG): container finished" podID="cd3a5dd4-70b9-462e-994b-18d4fbf399c3" containerID="0e3dd823b0e7edeb48bea5e24769d489449e67969f61df23346d62600de762c7" exitCode=0 Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.481361 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jshwg" event={"ID":"cd3a5dd4-70b9-462e-994b-18d4fbf399c3","Type":"ContainerDied","Data":"0e3dd823b0e7edeb48bea5e24769d489449e67969f61df23346d62600de762c7"} Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.492600 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zkj66"] Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.518184 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-catalog-content\") pod \"community-operators-8rbb4\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.518228 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-utilities\") pod \"community-operators-8rbb4\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.518264 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn6ns\" (UniqueName: \"kubernetes.io/projected/668fc837-03c4-4539-a9c6-dc5aca43693a-kube-api-access-gn6ns\") pod \"community-operators-8rbb4\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.519184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-catalog-content\") pod \"community-operators-8rbb4\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.519375 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-utilities\") pod \"community-operators-8rbb4\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.538201 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn6ns\" (UniqueName: \"kubernetes.io/projected/668fc837-03c4-4539-a9c6-dc5aca43693a-kube-api-access-gn6ns\") pod \"community-operators-8rbb4\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.618808 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c-utilities\") pod \"certified-operators-zkj66\" (UID: \"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c\") " pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.619152 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4qqm\" (UniqueName: \"kubernetes.io/projected/bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c-kube-api-access-t4qqm\") pod \"certified-operators-zkj66\" (UID: \"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c\") " pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.619304 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c-catalog-content\") pod \"certified-operators-zkj66\" (UID: \"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c\") " pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.619833 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.720157 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4qqm\" (UniqueName: \"kubernetes.io/projected/bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c-kube-api-access-t4qqm\") pod \"certified-operators-zkj66\" (UID: \"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c\") " pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.720427 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c-catalog-content\") pod \"certified-operators-zkj66\" (UID: \"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c\") " pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.720478 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c-utilities\") pod \"certified-operators-zkj66\" (UID: \"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c\") " pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.721035 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c-utilities\") pod \"certified-operators-zkj66\" (UID: \"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c\") " pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.721751 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c-catalog-content\") pod \"certified-operators-zkj66\" (UID: \"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c\") " 
pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.744268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4qqm\" (UniqueName: \"kubernetes.io/projected/bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c-kube-api-access-t4qqm\") pod \"certified-operators-zkj66\" (UID: \"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c\") " pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.789310 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.849185 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rbb4"] Oct 03 08:45:32 crc kubenswrapper[4790]: W1003 08:45:32.899898 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod668fc837_03c4_4539_a9c6_dc5aca43693a.slice/crio-f830574ffe0327d08f29fec0d939edd213f2e01f7b7967fe12e735e9beaf50ed WatchSource:0}: Error finding container f830574ffe0327d08f29fec0d939edd213f2e01f7b7967fe12e735e9beaf50ed: Status 404 returned error can't find the container with id f830574ffe0327d08f29fec0d939edd213f2e01f7b7967fe12e735e9beaf50ed Oct 03 08:45:32 crc kubenswrapper[4790]: I1003 08:45:32.993718 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zkj66"] Oct 03 08:45:33 crc kubenswrapper[4790]: I1003 08:45:33.488769 4790 generic.go:334] "Generic (PLEG): container finished" podID="61616e67-5189-42b0-a4bc-8f6fc803f2a7" containerID="859560a317d03405f9077c79a5b8cc409dbb9187e722c563195d8bb689e04a98" exitCode=0 Oct 03 08:45:33 crc kubenswrapper[4790]: I1003 08:45:33.488880 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z6kh" 
event={"ID":"61616e67-5189-42b0-a4bc-8f6fc803f2a7","Type":"ContainerDied","Data":"859560a317d03405f9077c79a5b8cc409dbb9187e722c563195d8bb689e04a98"} Oct 03 08:45:33 crc kubenswrapper[4790]: I1003 08:45:33.489825 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkj66" event={"ID":"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c","Type":"ContainerStarted","Data":"e0dfab27a69b28f659abc6b1e6bc5b0f0173e297aa4743fc1516f047bd9c8381"} Oct 03 08:45:33 crc kubenswrapper[4790]: I1003 08:45:33.491236 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rbb4" event={"ID":"668fc837-03c4-4539-a9c6-dc5aca43693a","Type":"ContainerStarted","Data":"f830574ffe0327d08f29fec0d939edd213f2e01f7b7967fe12e735e9beaf50ed"} Oct 03 08:45:34 crc kubenswrapper[4790]: I1003 08:45:34.497945 4790 generic.go:334] "Generic (PLEG): container finished" podID="668fc837-03c4-4539-a9c6-dc5aca43693a" containerID="71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0" exitCode=0 Oct 03 08:45:34 crc kubenswrapper[4790]: I1003 08:45:34.497981 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rbb4" event={"ID":"668fc837-03c4-4539-a9c6-dc5aca43693a","Type":"ContainerDied","Data":"71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0"} Oct 03 08:45:34 crc kubenswrapper[4790]: I1003 08:45:34.500225 4790 generic.go:334] "Generic (PLEG): container finished" podID="bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c" containerID="d7956c6663bb46086c7b72c6751828b48eb7bcb83d18a148dddd345490e0e065" exitCode=0 Oct 03 08:45:34 crc kubenswrapper[4790]: I1003 08:45:34.500250 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkj66" event={"ID":"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c","Type":"ContainerDied","Data":"d7956c6663bb46086c7b72c6751828b48eb7bcb83d18a148dddd345490e0e065"} Oct 03 08:45:37 crc kubenswrapper[4790]: I1003 
08:45:37.521776 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jshwg" event={"ID":"cd3a5dd4-70b9-462e-994b-18d4fbf399c3","Type":"ContainerStarted","Data":"1bc13b57ea6e2ed6fc304f6ce464c3200074ba653f838440e64d46ba58d3b2f7"} Oct 03 08:45:38 crc kubenswrapper[4790]: I1003 08:45:38.550547 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jshwg" podStartSLOduration=4.417046073 podStartE2EDuration="9.550527805s" podCreationTimestamp="2025-10-03 08:45:29 +0000 UTC" firstStartedPulling="2025-10-03 08:45:31.467661964 +0000 UTC m=+317.820596202" lastFinishedPulling="2025-10-03 08:45:36.601143686 +0000 UTC m=+322.954077934" observedRunningTime="2025-10-03 08:45:38.54822757 +0000 UTC m=+324.901161808" watchObservedRunningTime="2025-10-03 08:45:38.550527805 +0000 UTC m=+324.903462043" Oct 03 08:45:39 crc kubenswrapper[4790]: I1003 08:45:39.535428 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5z6kh" event={"ID":"61616e67-5189-42b0-a4bc-8f6fc803f2a7","Type":"ContainerStarted","Data":"2a4a64632c331c57e4b1b1bdc557976e3b4e63805d1d3bdce94c3c9fd5f9619c"} Oct 03 08:45:39 crc kubenswrapper[4790]: I1003 08:45:39.537733 4790 generic.go:334] "Generic (PLEG): container finished" podID="bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c" containerID="f70766f17459dbfcda9d17e09b1265ee5bc00fc6da5d76c303711180af956c4e" exitCode=0 Oct 03 08:45:39 crc kubenswrapper[4790]: I1003 08:45:39.537820 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkj66" event={"ID":"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c","Type":"ContainerDied","Data":"f70766f17459dbfcda9d17e09b1265ee5bc00fc6da5d76c303711180af956c4e"} Oct 03 08:45:39 crc kubenswrapper[4790]: I1003 08:45:39.540637 4790 generic.go:334] "Generic (PLEG): container finished" podID="668fc837-03c4-4539-a9c6-dc5aca43693a" 
containerID="0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a" exitCode=0 Oct 03 08:45:39 crc kubenswrapper[4790]: I1003 08:45:39.540766 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rbb4" event={"ID":"668fc837-03c4-4539-a9c6-dc5aca43693a","Type":"ContainerDied","Data":"0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a"} Oct 03 08:45:39 crc kubenswrapper[4790]: I1003 08:45:39.561807 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5z6kh" podStartSLOduration=2.533390432 podStartE2EDuration="9.561788297s" podCreationTimestamp="2025-10-03 08:45:30 +0000 UTC" firstStartedPulling="2025-10-03 08:45:31.467027556 +0000 UTC m=+317.819961794" lastFinishedPulling="2025-10-03 08:45:38.495425421 +0000 UTC m=+324.848359659" observedRunningTime="2025-10-03 08:45:39.558390511 +0000 UTC m=+325.911324749" watchObservedRunningTime="2025-10-03 08:45:39.561788297 +0000 UTC m=+325.914722535" Oct 03 08:45:40 crc kubenswrapper[4790]: I1003 08:45:40.186848 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jshwg" Oct 03 08:45:40 crc kubenswrapper[4790]: I1003 08:45:40.187232 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jshwg" Oct 03 08:45:40 crc kubenswrapper[4790]: I1003 08:45:40.239348 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jshwg" Oct 03 08:45:40 crc kubenswrapper[4790]: I1003 08:45:40.407819 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5z6kh" Oct 03 08:45:40 crc kubenswrapper[4790]: I1003 08:45:40.407980 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5z6kh" Oct 03 08:45:40 crc 
kubenswrapper[4790]: I1003 08:45:40.550274 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rbb4" event={"ID":"668fc837-03c4-4539-a9c6-dc5aca43693a","Type":"ContainerStarted","Data":"504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706"} Oct 03 08:45:40 crc kubenswrapper[4790]: I1003 08:45:40.553392 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkj66" event={"ID":"bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c","Type":"ContainerStarted","Data":"b4b0c4c414cea22c8ac9e197d95aff4ef010372f650c891dec3549489faa276e"} Oct 03 08:45:40 crc kubenswrapper[4790]: I1003 08:45:40.578899 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8rbb4" podStartSLOduration=3.09401244 podStartE2EDuration="8.578874175s" podCreationTimestamp="2025-10-03 08:45:32 +0000 UTC" firstStartedPulling="2025-10-03 08:45:34.499209408 +0000 UTC m=+320.852143646" lastFinishedPulling="2025-10-03 08:45:39.984071143 +0000 UTC m=+326.337005381" observedRunningTime="2025-10-03 08:45:40.575222512 +0000 UTC m=+326.928156750" watchObservedRunningTime="2025-10-03 08:45:40.578874175 +0000 UTC m=+326.931808413" Oct 03 08:45:41 crc kubenswrapper[4790]: I1003 08:45:41.453010 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5z6kh" podUID="61616e67-5189-42b0-a4bc-8f6fc803f2a7" containerName="registry-server" probeResult="failure" output=< Oct 03 08:45:41 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 08:45:41 crc kubenswrapper[4790]: > Oct 03 08:45:41 crc kubenswrapper[4790]: I1003 08:45:41.604938 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jshwg" Oct 03 08:45:41 crc kubenswrapper[4790]: I1003 08:45:41.632877 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-zkj66" podStartSLOduration=4.011244053 podStartE2EDuration="9.63285126s" podCreationTimestamp="2025-10-03 08:45:32 +0000 UTC" firstStartedPulling="2025-10-03 08:45:34.501722199 +0000 UTC m=+320.854656437" lastFinishedPulling="2025-10-03 08:45:40.123329406 +0000 UTC m=+326.476263644" observedRunningTime="2025-10-03 08:45:40.606715205 +0000 UTC m=+326.959649463" watchObservedRunningTime="2025-10-03 08:45:41.63285126 +0000 UTC m=+327.985785498" Oct 03 08:45:42 crc kubenswrapper[4790]: I1003 08:45:42.620510 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:42 crc kubenswrapper[4790]: I1003 08:45:42.621070 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:42 crc kubenswrapper[4790]: I1003 08:45:42.667876 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:42 crc kubenswrapper[4790]: I1003 08:45:42.790238 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:42 crc kubenswrapper[4790]: I1003 08:45:42.791331 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:42 crc kubenswrapper[4790]: I1003 08:45:42.840146 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:45:50 crc kubenswrapper[4790]: I1003 08:45:50.448935 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5z6kh" Oct 03 08:45:50 crc kubenswrapper[4790]: I1003 08:45:50.492884 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-5z6kh" Oct 03 08:45:52 crc kubenswrapper[4790]: I1003 08:45:52.662770 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8rbb4" Oct 03 08:45:52 crc kubenswrapper[4790]: I1003 08:45:52.875289 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zkj66" Oct 03 08:46:10 crc kubenswrapper[4790]: I1003 08:46:10.187144 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:46:10 crc kubenswrapper[4790]: I1003 08:46:10.187608 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:46:40 crc kubenswrapper[4790]: I1003 08:46:40.186660 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:46:40 crc kubenswrapper[4790]: I1003 08:46:40.187423 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:47:10 crc kubenswrapper[4790]: I1003 08:47:10.187171 4790 
patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:47:10 crc kubenswrapper[4790]: I1003 08:47:10.188652 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:47:10 crc kubenswrapper[4790]: I1003 08:47:10.188827 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:47:10 crc kubenswrapper[4790]: I1003 08:47:10.189540 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43a28ab79be93dc66c4ea1f907c9eeb943c508f991b4352e4e0fde3491023e73"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:47:10 crc kubenswrapper[4790]: I1003 08:47:10.189725 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://43a28ab79be93dc66c4ea1f907c9eeb943c508f991b4352e4e0fde3491023e73" gracePeriod=600 Oct 03 08:47:11 crc kubenswrapper[4790]: I1003 08:47:11.099345 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="43a28ab79be93dc66c4ea1f907c9eeb943c508f991b4352e4e0fde3491023e73" exitCode=0 Oct 03 08:47:11 crc 
kubenswrapper[4790]: I1003 08:47:11.099427 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"43a28ab79be93dc66c4ea1f907c9eeb943c508f991b4352e4e0fde3491023e73"} Oct 03 08:47:11 crc kubenswrapper[4790]: I1003 08:47:11.100093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"63ef82ccb7061f9ee1f5e4db190568a8e3d5bb60550f8a976b4df71eb3e198ad"} Oct 03 08:47:11 crc kubenswrapper[4790]: I1003 08:47:11.100124 4790 scope.go:117] "RemoveContainer" containerID="b8c3aea222eb433729d8eac107ebcdb306f649bef25942cd0390bfc1bbda5c74" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.128674 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-s722g"] Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.130005 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.149285 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-s722g"] Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.302397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9fbf594c-c9be-40c1-936c-6de1fe7f6208-ca-trust-extracted\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.302462 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9fbf594c-c9be-40c1-936c-6de1fe7f6208-bound-sa-token\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.302492 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9fbf594c-c9be-40c1-936c-6de1fe7f6208-registry-tls\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.302830 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fbf594c-c9be-40c1-936c-6de1fe7f6208-trusted-ca\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 
crc kubenswrapper[4790]: I1003 08:47:30.303026 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.303083 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksf5t\" (UniqueName: \"kubernetes.io/projected/9fbf594c-c9be-40c1-936c-6de1fe7f6208-kube-api-access-ksf5t\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.303169 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9fbf594c-c9be-40c1-936c-6de1fe7f6208-installation-pull-secrets\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.303239 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9fbf594c-c9be-40c1-936c-6de1fe7f6208-registry-certificates\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.337428 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.404877 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9fbf594c-c9be-40c1-936c-6de1fe7f6208-installation-pull-secrets\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.404942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9fbf594c-c9be-40c1-936c-6de1fe7f6208-registry-certificates\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.404992 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9fbf594c-c9be-40c1-936c-6de1fe7f6208-ca-trust-extracted\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.405022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9fbf594c-c9be-40c1-936c-6de1fe7f6208-bound-sa-token\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.405055 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9fbf594c-c9be-40c1-936c-6de1fe7f6208-registry-tls\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.405107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fbf594c-c9be-40c1-936c-6de1fe7f6208-trusted-ca\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.405143 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksf5t\" (UniqueName: \"kubernetes.io/projected/9fbf594c-c9be-40c1-936c-6de1fe7f6208-kube-api-access-ksf5t\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.406226 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9fbf594c-c9be-40c1-936c-6de1fe7f6208-ca-trust-extracted\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.407010 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fbf594c-c9be-40c1-936c-6de1fe7f6208-trusted-ca\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc 
kubenswrapper[4790]: I1003 08:47:30.407089 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9fbf594c-c9be-40c1-936c-6de1fe7f6208-registry-certificates\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.413318 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9fbf594c-c9be-40c1-936c-6de1fe7f6208-installation-pull-secrets\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.419140 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9fbf594c-c9be-40c1-936c-6de1fe7f6208-registry-tls\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.426142 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9fbf594c-c9be-40c1-936c-6de1fe7f6208-bound-sa-token\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.426165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksf5t\" (UniqueName: \"kubernetes.io/projected/9fbf594c-c9be-40c1-936c-6de1fe7f6208-kube-api-access-ksf5t\") pod \"image-registry-66df7c8f76-s722g\" (UID: \"9fbf594c-c9be-40c1-936c-6de1fe7f6208\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.445162 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:30 crc kubenswrapper[4790]: I1003 08:47:30.649475 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-s722g"] Oct 03 08:47:31 crc kubenswrapper[4790]: I1003 08:47:31.216366 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-s722g" event={"ID":"9fbf594c-c9be-40c1-936c-6de1fe7f6208","Type":"ContainerStarted","Data":"f931d712cb9e00a8a7f6eb64c076eeae31427d8e4f550b04fa0f9bde0efc34b4"} Oct 03 08:47:31 crc kubenswrapper[4790]: I1003 08:47:31.216428 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-s722g" event={"ID":"9fbf594c-c9be-40c1-936c-6de1fe7f6208","Type":"ContainerStarted","Data":"38fc40a3cc53eb91e557aa72d672dad17b0f89d5e52e91b0e0822479fa2fbf6e"} Oct 03 08:47:31 crc kubenswrapper[4790]: I1003 08:47:31.216543 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:31 crc kubenswrapper[4790]: I1003 08:47:31.238805 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-s722g" podStartSLOduration=1.238783202 podStartE2EDuration="1.238783202s" podCreationTimestamp="2025-10-03 08:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:47:31.235091537 +0000 UTC m=+437.588025795" watchObservedRunningTime="2025-10-03 08:47:31.238783202 +0000 UTC m=+437.591717430" Oct 03 08:47:50 crc kubenswrapper[4790]: I1003 08:47:50.451617 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-s722g" Oct 03 08:47:50 crc kubenswrapper[4790]: I1003 08:47:50.508141 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-csbrj"] Oct 03 08:48:15 crc kubenswrapper[4790]: I1003 08:48:15.555487 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" podUID="4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" containerName="registry" containerID="cri-o://401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769" gracePeriod=30 Oct 03 08:48:15 crc kubenswrapper[4790]: I1003 08:48:15.942779 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.056872 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjp5v\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-kube-api-access-tjp5v\") pod \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.057003 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-ca-trust-extracted\") pod \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.057105 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-certificates\") pod \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " Oct 03 08:48:16 crc kubenswrapper[4790]: 
I1003 08:48:16.057185 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-trusted-ca\") pod \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.057830 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.057933 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-bound-sa-token\") pod \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.057994 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-installation-pull-secrets\") pod \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.058036 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-tls\") pod \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\" (UID: \"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c\") " Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.058045 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-certificates" 
(OuterVolumeSpecName: "registry-certificates") pod "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.058057 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.058614 4790 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.058654 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.062912 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-kube-api-access-tjp5v" (OuterVolumeSpecName: "kube-api-access-tjp5v") pod "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c"). InnerVolumeSpecName "kube-api-access-tjp5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.065487 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.067101 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.069758 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.080874 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.081057 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" (UID: "4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.160016 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjp5v\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-kube-api-access-tjp5v\") on node \"crc\" DevicePath \"\"" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.160084 4790 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.160100 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.160113 4790 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.160126 4790 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.500518 4790 generic.go:334] "Generic (PLEG): container 
finished" podID="4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" containerID="401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769" exitCode=0 Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.500577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" event={"ID":"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c","Type":"ContainerDied","Data":"401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769"} Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.500635 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" event={"ID":"4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c","Type":"ContainerDied","Data":"5b0e9a812811ac3d469408b413be27b27ef9acd8068d65cb0718c22f9d428890"} Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.500658 4790 scope.go:117] "RemoveContainer" containerID="401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.500808 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-csbrj" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.525512 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-csbrj"] Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.526460 4790 scope.go:117] "RemoveContainer" containerID="401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769" Oct 03 08:48:16 crc kubenswrapper[4790]: E1003 08:48:16.526948 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769\": container with ID starting with 401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769 not found: ID does not exist" containerID="401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.526995 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769"} err="failed to get container status \"401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769\": rpc error: code = NotFound desc = could not find container \"401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769\": container with ID starting with 401e578b1d7b13af1d7223c99b3abb03355dad107f7f7280b0ad422a93328769 not found: ID does not exist" Oct 03 08:48:16 crc kubenswrapper[4790]: I1003 08:48:16.528142 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-csbrj"] Oct 03 08:48:18 crc kubenswrapper[4790]: I1003 08:48:18.334619 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" path="/var/lib/kubelet/pods/4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c/volumes" Oct 03 08:49:10 crc kubenswrapper[4790]: I1003 
08:49:10.186480 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:49:10 crc kubenswrapper[4790]: I1003 08:49:10.187455 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:49:40 crc kubenswrapper[4790]: I1003 08:49:40.186281 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:49:40 crc kubenswrapper[4790]: I1003 08:49:40.186914 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:50:10 crc kubenswrapper[4790]: I1003 08:50:10.187866 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:50:10 crc kubenswrapper[4790]: I1003 08:50:10.188341 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" 
podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:50:10 crc kubenswrapper[4790]: I1003 08:50:10.188403 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:50:10 crc kubenswrapper[4790]: I1003 08:50:10.189265 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63ef82ccb7061f9ee1f5e4db190568a8e3d5bb60550f8a976b4df71eb3e198ad"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:50:10 crc kubenswrapper[4790]: I1003 08:50:10.189319 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://63ef82ccb7061f9ee1f5e4db190568a8e3d5bb60550f8a976b4df71eb3e198ad" gracePeriod=600 Oct 03 08:50:11 crc kubenswrapper[4790]: I1003 08:50:11.250113 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="63ef82ccb7061f9ee1f5e4db190568a8e3d5bb60550f8a976b4df71eb3e198ad" exitCode=0 Oct 03 08:50:11 crc kubenswrapper[4790]: I1003 08:50:11.250217 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"63ef82ccb7061f9ee1f5e4db190568a8e3d5bb60550f8a976b4df71eb3e198ad"} Oct 03 08:50:11 crc kubenswrapper[4790]: I1003 08:50:11.253437 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"4ae452d98acbb7343cd83b179511e310c36044718e3f86be8c5675a6c5f1473a"} Oct 03 08:50:11 crc kubenswrapper[4790]: I1003 08:50:11.253534 4790 scope.go:117] "RemoveContainer" containerID="43a28ab79be93dc66c4ea1f907c9eeb943c508f991b4352e4e0fde3491023e73" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.887078 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-r87sx"] Oct 03 08:50:56 crc kubenswrapper[4790]: E1003 08:50:56.888557 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" containerName="registry" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.888603 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" containerName="registry" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.888913 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f5de6e6-d3aa-4e4e-9f8c-bf67bf228b8c" containerName="registry" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.895074 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-r87sx" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.904084 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-r87sx"] Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.904151 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-fpr25"] Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.904833 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-fpr25" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.909061 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-t7jtw" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.909315 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.911105 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.919155 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9k2xl"] Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.920154 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9k2xl" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.920678 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8dtgh" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.925443 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-fpr25"] Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.926794 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l7hhc" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.932000 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9k2xl"] Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.933947 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brzw7\" (UniqueName: \"kubernetes.io/projected/594cfed0-b92c-4763-998c-c7aceef4c2f5-kube-api-access-brzw7\") pod 
\"cert-manager-5b446d88c5-fpr25\" (UID: \"594cfed0-b92c-4763-998c-c7aceef4c2f5\") " pod="cert-manager/cert-manager-5b446d88c5-fpr25" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.933994 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tlqj\" (UniqueName: \"kubernetes.io/projected/f6eb09d0-b3d6-4bc0-9125-5389bf8124dd-kube-api-access-7tlqj\") pod \"cert-manager-cainjector-7f985d654d-r87sx\" (UID: \"f6eb09d0-b3d6-4bc0-9125-5389bf8124dd\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-r87sx" Oct 03 08:50:56 crc kubenswrapper[4790]: I1003 08:50:56.934033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjxk8\" (UniqueName: \"kubernetes.io/projected/9ea495a1-9693-44fd-a9d5-6f9a710456f5-kube-api-access-fjxk8\") pod \"cert-manager-webhook-5655c58dd6-9k2xl\" (UID: \"9ea495a1-9693-44fd-a9d5-6f9a710456f5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9k2xl" Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.035514 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brzw7\" (UniqueName: \"kubernetes.io/projected/594cfed0-b92c-4763-998c-c7aceef4c2f5-kube-api-access-brzw7\") pod \"cert-manager-5b446d88c5-fpr25\" (UID: \"594cfed0-b92c-4763-998c-c7aceef4c2f5\") " pod="cert-manager/cert-manager-5b446d88c5-fpr25" Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.035626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tlqj\" (UniqueName: \"kubernetes.io/projected/f6eb09d0-b3d6-4bc0-9125-5389bf8124dd-kube-api-access-7tlqj\") pod \"cert-manager-cainjector-7f985d654d-r87sx\" (UID: \"f6eb09d0-b3d6-4bc0-9125-5389bf8124dd\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-r87sx" Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.035687 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-fjxk8\" (UniqueName: \"kubernetes.io/projected/9ea495a1-9693-44fd-a9d5-6f9a710456f5-kube-api-access-fjxk8\") pod \"cert-manager-webhook-5655c58dd6-9k2xl\" (UID: \"9ea495a1-9693-44fd-a9d5-6f9a710456f5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9k2xl" Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.055588 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tlqj\" (UniqueName: \"kubernetes.io/projected/f6eb09d0-b3d6-4bc0-9125-5389bf8124dd-kube-api-access-7tlqj\") pod \"cert-manager-cainjector-7f985d654d-r87sx\" (UID: \"f6eb09d0-b3d6-4bc0-9125-5389bf8124dd\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-r87sx" Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.055691 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brzw7\" (UniqueName: \"kubernetes.io/projected/594cfed0-b92c-4763-998c-c7aceef4c2f5-kube-api-access-brzw7\") pod \"cert-manager-5b446d88c5-fpr25\" (UID: \"594cfed0-b92c-4763-998c-c7aceef4c2f5\") " pod="cert-manager/cert-manager-5b446d88c5-fpr25" Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.057393 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjxk8\" (UniqueName: \"kubernetes.io/projected/9ea495a1-9693-44fd-a9d5-6f9a710456f5-kube-api-access-fjxk8\") pod \"cert-manager-webhook-5655c58dd6-9k2xl\" (UID: \"9ea495a1-9693-44fd-a9d5-6f9a710456f5\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9k2xl" Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.237102 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-fpr25" Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.251527 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-r87sx" Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.263036 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9k2xl" Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.474399 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-fpr25"] Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.483342 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.555893 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-fpr25" event={"ID":"594cfed0-b92c-4763-998c-c7aceef4c2f5","Type":"ContainerStarted","Data":"6bc7f6512f24b9daf71a82aea9014dfffd638be43139de715687dc91502e5abb"} Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.722343 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9k2xl"] Oct 03 08:50:57 crc kubenswrapper[4790]: W1003 08:50:57.728957 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ea495a1_9693_44fd_a9d5_6f9a710456f5.slice/crio-1b2873acab14ec0437b394db5ffec53b8f04ca4711e1743550b28ef64a79c536 WatchSource:0}: Error finding container 1b2873acab14ec0437b394db5ffec53b8f04ca4711e1743550b28ef64a79c536: Status 404 returned error can't find the container with id 1b2873acab14ec0437b394db5ffec53b8f04ca4711e1743550b28ef64a79c536 Oct 03 08:50:57 crc kubenswrapper[4790]: I1003 08:50:57.736536 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-r87sx"] Oct 03 08:50:58 crc kubenswrapper[4790]: I1003 08:50:58.563112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-r87sx" event={"ID":"f6eb09d0-b3d6-4bc0-9125-5389bf8124dd","Type":"ContainerStarted","Data":"6a129494489d631ce601f1136beb506c2a66effe78c057fcf4c1d3352a0636fb"} Oct 03 08:50:58 crc kubenswrapper[4790]: I1003 08:50:58.566368 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9k2xl" event={"ID":"9ea495a1-9693-44fd-a9d5-6f9a710456f5","Type":"ContainerStarted","Data":"1b2873acab14ec0437b394db5ffec53b8f04ca4711e1743550b28ef64a79c536"} Oct 03 08:51:00 crc kubenswrapper[4790]: I1003 08:51:00.580120 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-fpr25" event={"ID":"594cfed0-b92c-4763-998c-c7aceef4c2f5","Type":"ContainerStarted","Data":"22f7ff6a5e20dde7db1b42261249a9652e6711302782257cf123060dc908d5c4"} Oct 03 08:51:00 crc kubenswrapper[4790]: I1003 08:51:00.598380 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-fpr25" podStartSLOduration=2.414635053 podStartE2EDuration="4.598182664s" podCreationTimestamp="2025-10-03 08:50:56 +0000 UTC" firstStartedPulling="2025-10-03 08:50:57.483125291 +0000 UTC m=+643.836059529" lastFinishedPulling="2025-10-03 08:50:59.666672902 +0000 UTC m=+646.019607140" observedRunningTime="2025-10-03 08:51:00.595541554 +0000 UTC m=+646.948475792" watchObservedRunningTime="2025-10-03 08:51:00.598182664 +0000 UTC m=+646.951116902" Oct 03 08:51:01 crc kubenswrapper[4790]: I1003 08:51:01.590161 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-r87sx" event={"ID":"f6eb09d0-b3d6-4bc0-9125-5389bf8124dd","Type":"ContainerStarted","Data":"cc8abd5eba0db3dc83e4c50c3523b10b7049e0ad88f8dc8dbaa96ff65440291e"} Oct 03 08:51:01 crc kubenswrapper[4790]: I1003 08:51:01.592673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9k2xl" 
event={"ID":"9ea495a1-9693-44fd-a9d5-6f9a710456f5","Type":"ContainerStarted","Data":"26fce29fbf1beead0be08f77e3104fb2f0d341a13a14c1abcd774780a894f5ad"} Oct 03 08:51:01 crc kubenswrapper[4790]: I1003 08:51:01.592815 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-9k2xl" Oct 03 08:51:01 crc kubenswrapper[4790]: I1003 08:51:01.611772 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-r87sx" podStartSLOduration=2.426543771 podStartE2EDuration="5.611750315s" podCreationTimestamp="2025-10-03 08:50:56 +0000 UTC" firstStartedPulling="2025-10-03 08:50:57.74452254 +0000 UTC m=+644.097456788" lastFinishedPulling="2025-10-03 08:51:00.929729094 +0000 UTC m=+647.282663332" observedRunningTime="2025-10-03 08:51:01.609502115 +0000 UTC m=+647.962436353" watchObservedRunningTime="2025-10-03 08:51:01.611750315 +0000 UTC m=+647.964684553" Oct 03 08:51:01 crc kubenswrapper[4790]: I1003 08:51:01.629922 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-9k2xl" podStartSLOduration=2.432012937 podStartE2EDuration="5.629900329s" podCreationTimestamp="2025-10-03 08:50:56 +0000 UTC" firstStartedPulling="2025-10-03 08:50:57.731348528 +0000 UTC m=+644.084282766" lastFinishedPulling="2025-10-03 08:51:00.92923592 +0000 UTC m=+647.282170158" observedRunningTime="2025-10-03 08:51:01.626657902 +0000 UTC m=+647.979592150" watchObservedRunningTime="2025-10-03 08:51:01.629900329 +0000 UTC m=+647.982834567" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.267394 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-9k2xl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.390686 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsjvh"] Oct 03 08:51:07 crc 
kubenswrapper[4790]: I1003 08:51:07.391098 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovn-controller" containerID="cri-o://27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef" gracePeriod=30 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.391345 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="northd" containerID="cri-o://6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7" gracePeriod=30 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.391498 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="nbdb" containerID="cri-o://55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297" gracePeriod=30 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.391507 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05" gracePeriod=30 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.391513 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="sbdb" containerID="cri-o://74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010" gracePeriod=30 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.391560 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" 
podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovn-acl-logging" containerID="cri-o://ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c" gracePeriod=30 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.391617 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="kube-rbac-proxy-node" containerID="cri-o://e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b" gracePeriod=30 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.432934 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" containerID="cri-o://85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2" gracePeriod=30 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.644230 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovnkube-controller/3.log" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.647678 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovn-acl-logging/0.log" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.648538 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovn-controller/0.log" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649194 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2" exitCode=0 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649227 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010" exitCode=0 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649238 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7" exitCode=0 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649235 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2"} Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649299 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010"} Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649249 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05" exitCode=0 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649331 4790 scope.go:117] "RemoveContainer" containerID="873da5acf186f02e995f5db997cebf47651cb6a2b87d4641ecb0318c004619e8" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649334 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b" exitCode=0 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649348 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c" exitCode=143 Oct 03 08:51:07 crc 
kubenswrapper[4790]: I1003 08:51:07.649358 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" containerID="27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef" exitCode=143 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649316 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7"} Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05"} Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b"} Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c"} Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.649545 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef"} Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.651939 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-hncl2_f2f5631d-8486-44f9-81b0-ee78515e7866/kube-multus/2.log" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.652490 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hncl2_f2f5631d-8486-44f9-81b0-ee78515e7866/kube-multus/1.log" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.652527 4790 generic.go:334] "Generic (PLEG): container finished" podID="f2f5631d-8486-44f9-81b0-ee78515e7866" containerID="6c79a380a2fa475e309213e6650566780b9ed65a03fcabc66eca015ddcc0ecde" exitCode=2 Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.652546 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hncl2" event={"ID":"f2f5631d-8486-44f9-81b0-ee78515e7866","Type":"ContainerDied","Data":"6c79a380a2fa475e309213e6650566780b9ed65a03fcabc66eca015ddcc0ecde"} Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.653098 4790 scope.go:117] "RemoveContainer" containerID="6c79a380a2fa475e309213e6650566780b9ed65a03fcabc66eca015ddcc0ecde" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.653287 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-hncl2_openshift-multus(f2f5631d-8486-44f9-81b0-ee78515e7866)\"" pod="openshift-multus/multus-hncl2" podUID="f2f5631d-8486-44f9-81b0-ee78515e7866" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.705210 4790 scope.go:117] "RemoveContainer" containerID="934c9d6aafe23db007c316b08845e9d67eecf0072ba180fa513fa213245cc37c" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.740008 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovn-acl-logging/0.log" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.740536 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovn-controller/0.log" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.741097 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793078 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qlwvl"] Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793320 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793332 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793341 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793347 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793355 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793362 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793374 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793380 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793390 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="nbdb" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793397 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="nbdb" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793408 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovn-acl-logging" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793414 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovn-acl-logging" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793421 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="northd" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793427 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="northd" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793433 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="kubecfg-setup" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793439 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="kubecfg-setup" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793449 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovn-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793454 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovn-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793462 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="sbdb" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793468 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="sbdb" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793500 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="kube-rbac-proxy-node" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793507 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="kube-rbac-proxy-node" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793631 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="kube-rbac-proxy-ovn-metrics" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793643 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="nbdb" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793651 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793658 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovn-acl-logging" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793666 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovn-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793674 4790 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793681 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="northd" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793687 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793693 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="kube-rbac-proxy-node" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793702 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="sbdb" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793793 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793800 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: E1003 08:51:07.793810 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793816 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.793911 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.794126 4790 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" containerName="ovnkube-controller" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.795734 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.891678 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-var-lib-openvswitch\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.891745 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-netns\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.891788 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-systemd-units\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.891838 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-netd\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.891859 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-openvswitch\") pod 
\"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.891865 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.891883 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-ovn-kubernetes\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.891942 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.891987 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892002 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892012 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-log-socket\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892033 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-log-socket" (OuterVolumeSpecName: "log-socket") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892018 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892108 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892179 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-script-lib\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892513 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa372dcc-63aa-48e5-b153-7739078026ef-ovn-node-metrics-cert\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892620 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892749 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-config\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892786 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892814 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-systemd\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892839 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-slash\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892879 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-node-log\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892897 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-ovn\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892899 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892915 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-bin\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892933 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-slash" (OuterVolumeSpecName: "host-slash") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892979 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52pgl\" (UniqueName: \"kubernetes.io/projected/fa372dcc-63aa-48e5-b153-7739078026ef-kube-api-access-52pgl\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893041 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-env-overrides\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.892985 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893097 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-kubelet\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893070 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893162 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-etc-openvswitch\") pod \"fa372dcc-63aa-48e5-b153-7739078026ef\" (UID: \"fa372dcc-63aa-48e5-b153-7739078026ef\") " Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893068 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-node-log" (OuterVolumeSpecName: "node-log") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893132 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893137 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893197 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893412 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06207a11-586c-4faa-a5df-4ab5215cef89-env-overrides\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893457 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-run-openvswitch\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893468 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893492 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-systemd-units\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893646 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06207a11-586c-4faa-a5df-4ab5215cef89-ovnkube-config\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893706 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-etc-openvswitch\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893736 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-run-ovn\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893788 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06207a11-586c-4faa-a5df-4ab5215cef89-ovn-node-metrics-cert\") pod \"ovnkube-node-qlwvl\" (UID: 
\"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893827 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-slash\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893869 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-run-ovn-kubernetes\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893926 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-kubelet\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.893981 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-cni-netd\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894026 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-run-systemd\") pod 
\"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-node-log\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894103 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-cni-bin\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-var-lib-openvswitch\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894369 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-run-netns\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894454 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-log-socket\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894477 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/06207a11-586c-4faa-a5df-4ab5215cef89-ovnkube-script-lib\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894624 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4wd7\" (UniqueName: \"kubernetes.io/projected/06207a11-586c-4faa-a5df-4ab5215cef89-kube-api-access-n4wd7\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894738 4790 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894756 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-config\") on node 
\"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894770 4790 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-slash\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894780 4790 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-node-log\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894789 4790 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894798 4790 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894807 4790 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894816 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894824 4790 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894834 4790 reconciler_common.go:293] "Volume detached for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894844 4790 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894852 4790 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894862 4790 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894870 4790 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894881 4790 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894892 4790 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-log-socket\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.894900 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/fa372dcc-63aa-48e5-b153-7739078026ef-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.898533 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa372dcc-63aa-48e5-b153-7739078026ef-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.899506 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa372dcc-63aa-48e5-b153-7739078026ef-kube-api-access-52pgl" (OuterVolumeSpecName: "kube-api-access-52pgl") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "kube-api-access-52pgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.909263 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fa372dcc-63aa-48e5-b153-7739078026ef" (UID: "fa372dcc-63aa-48e5-b153-7739078026ef"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996320 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4wd7\" (UniqueName: \"kubernetes.io/projected/06207a11-586c-4faa-a5df-4ab5215cef89-kube-api-access-n4wd7\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996396 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06207a11-586c-4faa-a5df-4ab5215cef89-env-overrides\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996432 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-systemd-units\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996466 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-run-openvswitch\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996517 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06207a11-586c-4faa-a5df-4ab5215cef89-ovnkube-config\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 
08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-etc-openvswitch\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-run-ovn\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996631 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-run-openvswitch\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996685 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-etc-openvswitch\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996669 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06207a11-586c-4faa-a5df-4ab5215cef89-ovn-node-metrics-cert\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996631 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-systemd-units\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996863 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-slash\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.996892 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-run-ovn\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997018 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-slash\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-run-ovn-kubernetes\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-cni-netd\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997117 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-kubelet\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997150 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-run-systemd\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997153 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06207a11-586c-4faa-a5df-4ab5215cef89-env-overrides\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997173 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-node-log\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997202 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-cni-bin\") pod \"ovnkube-node-qlwvl\" (UID: 
\"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997208 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-kubelet\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997222 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-run-systemd\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997246 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-var-lib-openvswitch\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997209 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-run-ovn-kubernetes\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997230 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-cni-netd\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" 
Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997288 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-cni-bin\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997236 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-node-log\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997314 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-var-lib-openvswitch\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997368 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 
08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997462 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-run-netns\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997481 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-host-run-netns\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06207a11-586c-4faa-a5df-4ab5215cef89-ovnkube-config\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-log-socket\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997617 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06207a11-586c-4faa-a5df-4ab5215cef89-log-socket\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997686 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/06207a11-586c-4faa-a5df-4ab5215cef89-ovnkube-script-lib\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997744 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fa372dcc-63aa-48e5-b153-7739078026ef-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997765 4790 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fa372dcc-63aa-48e5-b153-7739078026ef-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.997782 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52pgl\" (UniqueName: \"kubernetes.io/projected/fa372dcc-63aa-48e5-b153-7739078026ef-kube-api-access-52pgl\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:07 crc kubenswrapper[4790]: I1003 08:51:07.998508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/06207a11-586c-4faa-a5df-4ab5215cef89-ovnkube-script-lib\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.000965 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06207a11-586c-4faa-a5df-4ab5215cef89-ovn-node-metrics-cert\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.015642 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n4wd7\" (UniqueName: \"kubernetes.io/projected/06207a11-586c-4faa-a5df-4ab5215cef89-kube-api-access-n4wd7\") pod \"ovnkube-node-qlwvl\" (UID: \"06207a11-586c-4faa-a5df-4ab5215cef89\") " pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.111336 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.663849 4790 generic.go:334] "Generic (PLEG): container finished" podID="06207a11-586c-4faa-a5df-4ab5215cef89" containerID="f4a4efc68a8e91e312c5e81f144f1095497506dd2a3c1dc90cf6e2c4c23dee80" exitCode=0 Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.663983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" event={"ID":"06207a11-586c-4faa-a5df-4ab5215cef89","Type":"ContainerDied","Data":"f4a4efc68a8e91e312c5e81f144f1095497506dd2a3c1dc90cf6e2c4c23dee80"} Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.664275 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" event={"ID":"06207a11-586c-4faa-a5df-4ab5215cef89","Type":"ContainerStarted","Data":"5e630c92fac9ad3dcfd5265300041154a10d52effc2729aa4a4632cbb2340ee1"} Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.673976 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovn-acl-logging/0.log" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.677809 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsjvh_fa372dcc-63aa-48e5-b153-7739078026ef/ovn-controller/0.log" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.678805 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa372dcc-63aa-48e5-b153-7739078026ef" 
containerID="55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297" exitCode=0 Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.678947 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297"} Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.678957 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.678995 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsjvh" event={"ID":"fa372dcc-63aa-48e5-b153-7739078026ef","Type":"ContainerDied","Data":"179fd7ef9236cbccc231d9c8ae77862c35390fb123bb38b62441df8c69e2b09e"} Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.679031 4790 scope.go:117] "RemoveContainer" containerID="85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.682962 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hncl2_f2f5631d-8486-44f9-81b0-ee78515e7866/kube-multus/2.log" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.707492 4790 scope.go:117] "RemoveContainer" containerID="74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.745159 4790 scope.go:117] "RemoveContainer" containerID="55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.749644 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsjvh"] Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.754009 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-vsjvh"] Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.765473 4790 scope.go:117] "RemoveContainer" containerID="6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.799805 4790 scope.go:117] "RemoveContainer" containerID="e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.819247 4790 scope.go:117] "RemoveContainer" containerID="e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.837092 4790 scope.go:117] "RemoveContainer" containerID="ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.856798 4790 scope.go:117] "RemoveContainer" containerID="27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.874004 4790 scope.go:117] "RemoveContainer" containerID="e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.905353 4790 scope.go:117] "RemoveContainer" containerID="85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2" Oct 03 08:51:08 crc kubenswrapper[4790]: E1003 08:51:08.905999 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2\": container with ID starting with 85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2 not found: ID does not exist" containerID="85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.906060 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2"} err="failed to get container status \"85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2\": rpc error: code = NotFound desc = could not find container \"85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2\": container with ID starting with 85ca20fdf25ea02c6a17221523f0f3ffcf7af8343ea18ff5121a059dea5bddb2 not found: ID does not exist" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.906096 4790 scope.go:117] "RemoveContainer" containerID="74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010" Oct 03 08:51:08 crc kubenswrapper[4790]: E1003 08:51:08.906493 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\": container with ID starting with 74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010 not found: ID does not exist" containerID="74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.906515 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010"} err="failed to get container status \"74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\": rpc error: code = NotFound desc = could not find container \"74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010\": container with ID starting with 74d44e422b0cbbd3793553dad98475d5a93ac38530a167db0dd1ef152edac010 not found: ID does not exist" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.906528 4790 scope.go:117] "RemoveContainer" containerID="55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297" Oct 03 08:51:08 crc kubenswrapper[4790]: E1003 08:51:08.907046 4790 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\": container with ID starting with 55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297 not found: ID does not exist" containerID="55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.907065 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297"} err="failed to get container status \"55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\": rpc error: code = NotFound desc = could not find container \"55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297\": container with ID starting with 55b4e8ab33316a7f7057ee723ebd49eb0adcd7270124f084b2828ce538ed3297 not found: ID does not exist" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.907078 4790 scope.go:117] "RemoveContainer" containerID="6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7" Oct 03 08:51:08 crc kubenswrapper[4790]: E1003 08:51:08.907384 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\": container with ID starting with 6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7 not found: ID does not exist" containerID="6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.907431 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7"} err="failed to get container status \"6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\": rpc error: code = NotFound desc = could not find container 
\"6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7\": container with ID starting with 6f2aaa340e3927261ef0119f786b348e2091ddcd4b791619cc521122cc0774e7 not found: ID does not exist" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.907470 4790 scope.go:117] "RemoveContainer" containerID="e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05" Oct 03 08:51:08 crc kubenswrapper[4790]: E1003 08:51:08.909444 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\": container with ID starting with e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05 not found: ID does not exist" containerID="e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.909523 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05"} err="failed to get container status \"e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\": rpc error: code = NotFound desc = could not find container \"e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05\": container with ID starting with e52e55afef7b68f05e0c7d975d4497ee57dd9fa5aef7f72e9ca7441ea1633c05 not found: ID does not exist" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.909837 4790 scope.go:117] "RemoveContainer" containerID="e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b" Oct 03 08:51:08 crc kubenswrapper[4790]: E1003 08:51:08.910325 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\": container with ID starting with e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b not found: ID does not exist" 
containerID="e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.910407 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b"} err="failed to get container status \"e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\": rpc error: code = NotFound desc = could not find container \"e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b\": container with ID starting with e69102d376e90485ca565d34e18dbc12067e166f0bdf3c4325a0873f9f2b9b0b not found: ID does not exist" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.910452 4790 scope.go:117] "RemoveContainer" containerID="ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c" Oct 03 08:51:08 crc kubenswrapper[4790]: E1003 08:51:08.910990 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\": container with ID starting with ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c not found: ID does not exist" containerID="ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.911019 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c"} err="failed to get container status \"ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\": rpc error: code = NotFound desc = could not find container \"ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c\": container with ID starting with ff0f1dd623e164ebe1862661c5905a38bec3aa1dd5c61982eebde25d0feb4c6c not found: ID does not exist" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.911037 4790 scope.go:117] 
"RemoveContainer" containerID="27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef" Oct 03 08:51:08 crc kubenswrapper[4790]: E1003 08:51:08.911553 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\": container with ID starting with 27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef not found: ID does not exist" containerID="27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.911607 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef"} err="failed to get container status \"27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\": rpc error: code = NotFound desc = could not find container \"27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef\": container with ID starting with 27b1573f76936a5148d24accaa7fdc1a1da552fb56b348812445e7f1c1e4cbef not found: ID does not exist" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.911627 4790 scope.go:117] "RemoveContainer" containerID="e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4" Oct 03 08:51:08 crc kubenswrapper[4790]: E1003 08:51:08.911951 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\": container with ID starting with e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4 not found: ID does not exist" containerID="e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4" Oct 03 08:51:08 crc kubenswrapper[4790]: I1003 08:51:08.911974 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4"} err="failed to get container status \"e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\": rpc error: code = NotFound desc = could not find container \"e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4\": container with ID starting with e45c39722f37fc673cee930ca10396bba43b56e19e61aa99bd2b89f131341cb4 not found: ID does not exist" Oct 03 08:51:09 crc kubenswrapper[4790]: I1003 08:51:09.700550 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" event={"ID":"06207a11-586c-4faa-a5df-4ab5215cef89","Type":"ContainerStarted","Data":"f62ec0dd841d30de740309e7f9c594476b7d29c2a22a57430d253cf54b7fd344"} Oct 03 08:51:09 crc kubenswrapper[4790]: I1003 08:51:09.700612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" event={"ID":"06207a11-586c-4faa-a5df-4ab5215cef89","Type":"ContainerStarted","Data":"cc111c610e9df3af456d6db2d2ee1a3238fcaceb9fcfd9cf40294e67d3c2bc21"} Oct 03 08:51:09 crc kubenswrapper[4790]: I1003 08:51:09.700622 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" event={"ID":"06207a11-586c-4faa-a5df-4ab5215cef89","Type":"ContainerStarted","Data":"9bf80fb67bf9907fe172491b3f5ea5ca89123cfbadb72f43d9fbb18bc1cb8fd7"} Oct 03 08:51:09 crc kubenswrapper[4790]: I1003 08:51:09.700631 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" event={"ID":"06207a11-586c-4faa-a5df-4ab5215cef89","Type":"ContainerStarted","Data":"6078a0c028c412e225736468fcee91c2537d7c043a8728cf035a54d924573db1"} Oct 03 08:51:09 crc kubenswrapper[4790]: I1003 08:51:09.700639 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" 
event={"ID":"06207a11-586c-4faa-a5df-4ab5215cef89","Type":"ContainerStarted","Data":"a298ae6ba6f349853365dc907c670bad91ccad7f40b6e0973c0879d01acb3b30"} Oct 03 08:51:09 crc kubenswrapper[4790]: I1003 08:51:09.700647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" event={"ID":"06207a11-586c-4faa-a5df-4ab5215cef89","Type":"ContainerStarted","Data":"6f6b844ba292c73cd3297cd400b20e764fdcb7bd9fd4fdafc868b2941ffa81b4"} Oct 03 08:51:10 crc kubenswrapper[4790]: I1003 08:51:10.341285 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa372dcc-63aa-48e5-b153-7739078026ef" path="/var/lib/kubelet/pods/fa372dcc-63aa-48e5-b153-7739078026ef/volumes" Oct 03 08:51:12 crc kubenswrapper[4790]: I1003 08:51:12.725810 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" event={"ID":"06207a11-586c-4faa-a5df-4ab5215cef89","Type":"ContainerStarted","Data":"432b0271d1c067c175b9ac3fcc01fd303d9e409dfc705b5e1f80244b6cca8b29"} Oct 03 08:51:14 crc kubenswrapper[4790]: I1003 08:51:14.743840 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" event={"ID":"06207a11-586c-4faa-a5df-4ab5215cef89","Type":"ContainerStarted","Data":"207042e60a61781e36ffeed3358c076f857c8d9e9ea6a3642bd700b8cbdefe74"} Oct 03 08:51:14 crc kubenswrapper[4790]: I1003 08:51:14.744440 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:14 crc kubenswrapper[4790]: I1003 08:51:14.778988 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" podStartSLOduration=7.778968464 podStartE2EDuration="7.778968464s" podCreationTimestamp="2025-10-03 08:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 
08:51:14.775764459 +0000 UTC m=+661.128698777" watchObservedRunningTime="2025-10-03 08:51:14.778968464 +0000 UTC m=+661.131902702" Oct 03 08:51:14 crc kubenswrapper[4790]: I1003 08:51:14.788398 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:15 crc kubenswrapper[4790]: I1003 08:51:15.749836 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:15 crc kubenswrapper[4790]: I1003 08:51:15.749883 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:15 crc kubenswrapper[4790]: I1003 08:51:15.777284 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:18 crc kubenswrapper[4790]: I1003 08:51:18.328225 4790 scope.go:117] "RemoveContainer" containerID="6c79a380a2fa475e309213e6650566780b9ed65a03fcabc66eca015ddcc0ecde" Oct 03 08:51:18 crc kubenswrapper[4790]: E1003 08:51:18.330309 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-hncl2_openshift-multus(f2f5631d-8486-44f9-81b0-ee78515e7866)\"" pod="openshift-multus/multus-hncl2" podUID="f2f5631d-8486-44f9-81b0-ee78515e7866" Oct 03 08:51:33 crc kubenswrapper[4790]: I1003 08:51:33.328647 4790 scope.go:117] "RemoveContainer" containerID="6c79a380a2fa475e309213e6650566780b9ed65a03fcabc66eca015ddcc0ecde" Oct 03 08:51:33 crc kubenswrapper[4790]: I1003 08:51:33.862722 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hncl2_f2f5631d-8486-44f9-81b0-ee78515e7866/kube-multus/2.log" Oct 03 08:51:33 crc kubenswrapper[4790]: I1003 08:51:33.863299 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-hncl2" event={"ID":"f2f5631d-8486-44f9-81b0-ee78515e7866","Type":"ContainerStarted","Data":"67ff3b9f20784bcb9bfd4fc8baad78e796845405a6d2abefb0ec6003ce13c3df"} Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.347180 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q"] Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.348755 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.351752 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.365749 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q"] Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.474481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcvzl\" (UniqueName: \"kubernetes.io/projected/f8b2b263-b285-4112-96fe-22081359fece-kube-api-access-gcvzl\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.474990 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 
03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.475016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.576367 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcvzl\" (UniqueName: \"kubernetes.io/projected/f8b2b263-b285-4112-96fe-22081359fece-kube-api-access-gcvzl\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.576440 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.576464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.577085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"util\" (UniqueName: \"kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.577274 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.598443 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcvzl\" (UniqueName: \"kubernetes.io/projected/f8b2b263-b285-4112-96fe-22081359fece-kube-api-access-gcvzl\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:35 crc kubenswrapper[4790]: I1003 08:51:35.671828 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:36 crc kubenswrapper[4790]: I1003 08:51:36.113167 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q"] Oct 03 08:51:36 crc kubenswrapper[4790]: I1003 08:51:36.888294 4790 generic.go:334] "Generic (PLEG): container finished" podID="f8b2b263-b285-4112-96fe-22081359fece" containerID="b7f50745551035ed71d73b89bc8a2686a867da672e8f2bec890893758da34f16" exitCode=0 Oct 03 08:51:36 crc kubenswrapper[4790]: I1003 08:51:36.888347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" event={"ID":"f8b2b263-b285-4112-96fe-22081359fece","Type":"ContainerDied","Data":"b7f50745551035ed71d73b89bc8a2686a867da672e8f2bec890893758da34f16"} Oct 03 08:51:36 crc kubenswrapper[4790]: I1003 08:51:36.888378 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" event={"ID":"f8b2b263-b285-4112-96fe-22081359fece","Type":"ContainerStarted","Data":"7f6b01efdc4e6d447f3891e0396b735a2681933ce3329ccb2e3814ff9290e83e"} Oct 03 08:51:38 crc kubenswrapper[4790]: I1003 08:51:38.142206 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qlwvl" Oct 03 08:51:38 crc kubenswrapper[4790]: I1003 08:51:38.903595 4790 generic.go:334] "Generic (PLEG): container finished" podID="f8b2b263-b285-4112-96fe-22081359fece" containerID="41c36893abb51a56216539ed063211890ae3ce2e98342f3115a677fe91034912" exitCode=0 Oct 03 08:51:38 crc kubenswrapper[4790]: I1003 08:51:38.903691 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" 
event={"ID":"f8b2b263-b285-4112-96fe-22081359fece","Type":"ContainerDied","Data":"41c36893abb51a56216539ed063211890ae3ce2e98342f3115a677fe91034912"} Oct 03 08:51:39 crc kubenswrapper[4790]: I1003 08:51:39.911869 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" event={"ID":"f8b2b263-b285-4112-96fe-22081359fece","Type":"ContainerStarted","Data":"8e530201e263f9d259f27e54af1b9a80e33a632b819e7604527a194cd84bb6f6"} Oct 03 08:51:39 crc kubenswrapper[4790]: I1003 08:51:39.940163 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" podStartSLOduration=3.7762062419999998 podStartE2EDuration="4.940135807s" podCreationTimestamp="2025-10-03 08:51:35 +0000 UTC" firstStartedPulling="2025-10-03 08:51:36.889760761 +0000 UTC m=+683.242695009" lastFinishedPulling="2025-10-03 08:51:38.053690336 +0000 UTC m=+684.406624574" observedRunningTime="2025-10-03 08:51:39.938175895 +0000 UTC m=+686.291110153" watchObservedRunningTime="2025-10-03 08:51:39.940135807 +0000 UTC m=+686.293070065" Oct 03 08:51:40 crc kubenswrapper[4790]: I1003 08:51:40.919403 4790 generic.go:334] "Generic (PLEG): container finished" podID="f8b2b263-b285-4112-96fe-22081359fece" containerID="8e530201e263f9d259f27e54af1b9a80e33a632b819e7604527a194cd84bb6f6" exitCode=0 Oct 03 08:51:40 crc kubenswrapper[4790]: I1003 08:51:40.919464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" event={"ID":"f8b2b263-b285-4112-96fe-22081359fece","Type":"ContainerDied","Data":"8e530201e263f9d259f27e54af1b9a80e33a632b819e7604527a194cd84bb6f6"} Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.137544 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.288295 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-bundle\") pod \"f8b2b263-b285-4112-96fe-22081359fece\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.288441 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcvzl\" (UniqueName: \"kubernetes.io/projected/f8b2b263-b285-4112-96fe-22081359fece-kube-api-access-gcvzl\") pod \"f8b2b263-b285-4112-96fe-22081359fece\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.288519 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-util\") pod \"f8b2b263-b285-4112-96fe-22081359fece\" (UID: \"f8b2b263-b285-4112-96fe-22081359fece\") " Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.290992 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-bundle" (OuterVolumeSpecName: "bundle") pod "f8b2b263-b285-4112-96fe-22081359fece" (UID: "f8b2b263-b285-4112-96fe-22081359fece"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.293664 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b2b263-b285-4112-96fe-22081359fece-kube-api-access-gcvzl" (OuterVolumeSpecName: "kube-api-access-gcvzl") pod "f8b2b263-b285-4112-96fe-22081359fece" (UID: "f8b2b263-b285-4112-96fe-22081359fece"). InnerVolumeSpecName "kube-api-access-gcvzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.303828 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-util" (OuterVolumeSpecName: "util") pod "f8b2b263-b285-4112-96fe-22081359fece" (UID: "f8b2b263-b285-4112-96fe-22081359fece"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.390374 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcvzl\" (UniqueName: \"kubernetes.io/projected/f8b2b263-b285-4112-96fe-22081359fece-kube-api-access-gcvzl\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.390418 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.390433 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f8b2b263-b285-4112-96fe-22081359fece-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.936850 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.937561 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q" event={"ID":"f8b2b263-b285-4112-96fe-22081359fece","Type":"ContainerDied","Data":"7f6b01efdc4e6d447f3891e0396b735a2681933ce3329ccb2e3814ff9290e83e"} Oct 03 08:51:42 crc kubenswrapper[4790]: I1003 08:51:42.937735 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f6b01efdc4e6d447f3891e0396b735a2681933ce3329ccb2e3814ff9290e83e" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.304704 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r"] Oct 03 08:51:52 crc kubenswrapper[4790]: E1003 08:51:52.305486 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b2b263-b285-4112-96fe-22081359fece" containerName="util" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.305500 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b2b263-b285-4112-96fe-22081359fece" containerName="util" Oct 03 08:51:52 crc kubenswrapper[4790]: E1003 08:51:52.305511 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b2b263-b285-4112-96fe-22081359fece" containerName="pull" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.305517 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b2b263-b285-4112-96fe-22081359fece" containerName="pull" Oct 03 08:51:52 crc kubenswrapper[4790]: E1003 08:51:52.305530 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8b2b263-b285-4112-96fe-22081359fece" containerName="extract" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.305536 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b2b263-b285-4112-96fe-22081359fece" 
containerName="extract" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.305651 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8b2b263-b285-4112-96fe-22081359fece" containerName="extract" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.306129 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.308765 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-9xs5b" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.309254 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.309385 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.335938 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r"] Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.415350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5cnr\" (UniqueName: \"kubernetes.io/projected/66046d21-5295-4dd4-9f38-3859fc71918e-kube-api-access-p5cnr\") pod \"obo-prometheus-operator-7c8cf85677-wlf7r\" (UID: \"66046d21-5295-4dd4-9f38-3859fc71918e\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.475842 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx"] Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.482710 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.502973 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-l5964" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.509300 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.510855 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd"] Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.512979 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.516188 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx"] Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.516810 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5cnr\" (UniqueName: \"kubernetes.io/projected/66046d21-5295-4dd4-9f38-3859fc71918e-kube-api-access-p5cnr\") pod \"obo-prometheus-operator-7c8cf85677-wlf7r\" (UID: \"66046d21-5295-4dd4-9f38-3859fc71918e\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.525840 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd"] Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.542853 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5cnr\" (UniqueName: 
\"kubernetes.io/projected/66046d21-5295-4dd4-9f38-3859fc71918e-kube-api-access-p5cnr\") pod \"obo-prometheus-operator-7c8cf85677-wlf7r\" (UID: \"66046d21-5295-4dd4-9f38-3859fc71918e\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.618762 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f0ce448-4fba-4522-b3d3-8287c471d89b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956bfc86-q2djd\" (UID: \"2f0ce448-4fba-4522-b3d3-8287c471d89b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.619096 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a774d8d-e6d0-4a3c-ad82-767e7a213d2e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956bfc86-24zvx\" (UID: \"9a774d8d-e6d0-4a3c-ad82-767e7a213d2e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.619193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f0ce448-4fba-4522-b3d3-8287c471d89b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f956bfc86-q2djd\" (UID: \"2f0ce448-4fba-4522-b3d3-8287c471d89b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.619220 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a774d8d-e6d0-4a3c-ad82-767e7a213d2e-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-f956bfc86-24zvx\" (UID: \"9a774d8d-e6d0-4a3c-ad82-767e7a213d2e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.629157 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.672815 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-rmlf2"] Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.674030 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" Oct 03 08:51:52 crc kubenswrapper[4790]: W1003 08:51:52.675952 4790 reflector.go:561] object-"openshift-operators"/"observability-operator-sa-dockercfg-tp5tb": failed to list *v1.Secret: secrets "observability-operator-sa-dockercfg-tp5tb" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Oct 03 08:51:52 crc kubenswrapper[4790]: E1003 08:51:52.675988 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"observability-operator-sa-dockercfg-tp5tb\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"observability-operator-sa-dockercfg-tp5tb\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 08:51:52 crc kubenswrapper[4790]: W1003 08:51:52.676122 4790 reflector.go:561] object-"openshift-operators"/"observability-operator-tls": failed to list *v1.Secret: secrets "observability-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in 
API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Oct 03 08:51:52 crc kubenswrapper[4790]: E1003 08:51:52.676142 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"observability-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"observability-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.688545 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-rmlf2"] Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.721024 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a774d8d-e6d0-4a3c-ad82-767e7a213d2e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956bfc86-24zvx\" (UID: \"9a774d8d-e6d0-4a3c-ad82-767e7a213d2e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.721070 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f0ce448-4fba-4522-b3d3-8287c471d89b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f956bfc86-q2djd\" (UID: \"2f0ce448-4fba-4522-b3d3-8287c471d89b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.721087 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a774d8d-e6d0-4a3c-ad82-767e7a213d2e-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-f956bfc86-24zvx\" (UID: \"9a774d8d-e6d0-4a3c-ad82-767e7a213d2e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.721287 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f0ce448-4fba-4522-b3d3-8287c471d89b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956bfc86-q2djd\" (UID: \"2f0ce448-4fba-4522-b3d3-8287c471d89b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.728114 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a774d8d-e6d0-4a3c-ad82-767e7a213d2e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f956bfc86-24zvx\" (UID: \"9a774d8d-e6d0-4a3c-ad82-767e7a213d2e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.742210 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a774d8d-e6d0-4a3c-ad82-767e7a213d2e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956bfc86-24zvx\" (UID: \"9a774d8d-e6d0-4a3c-ad82-767e7a213d2e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.742333 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2f0ce448-4fba-4522-b3d3-8287c471d89b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-f956bfc86-q2djd\" (UID: \"2f0ce448-4fba-4522-b3d3-8287c471d89b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" Oct 03 08:51:52 crc 
kubenswrapper[4790]: I1003 08:51:52.746076 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2f0ce448-4fba-4522-b3d3-8287c471d89b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-f956bfc86-q2djd\" (UID: \"2f0ce448-4fba-4522-b3d3-8287c471d89b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.793263 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-8sh7j"] Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.794082 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.798995 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-2rkzw" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.817967 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.824674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dklfr\" (UniqueName: \"kubernetes.io/projected/ed2d3ffa-e89d-45d0-b131-8567b5abda08-kube-api-access-dklfr\") pod \"perses-operator-54bc95c9fb-8sh7j\" (UID: \"ed2d3ffa-e89d-45d0-b131-8567b5abda08\") " pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.824770 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/feadd2c8-de45-4819-8626-812b7007c679-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-rmlf2\" (UID: \"feadd2c8-de45-4819-8626-812b7007c679\") " pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.824799 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2d3ffa-e89d-45d0-b131-8567b5abda08-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-8sh7j\" (UID: \"ed2d3ffa-e89d-45d0-b131-8567b5abda08\") " pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.824842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwm9x\" (UniqueName: \"kubernetes.io/projected/feadd2c8-de45-4819-8626-812b7007c679-kube-api-access-fwm9x\") pod \"observability-operator-cc5f78dfc-rmlf2\" (UID: \"feadd2c8-de45-4819-8626-812b7007c679\") " pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.830642 4790 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.840660 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-8sh7j"] Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.926049 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/feadd2c8-de45-4819-8626-812b7007c679-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-rmlf2\" (UID: \"feadd2c8-de45-4819-8626-812b7007c679\") " pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.926458 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2d3ffa-e89d-45d0-b131-8567b5abda08-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-8sh7j\" (UID: \"ed2d3ffa-e89d-45d0-b131-8567b5abda08\") " pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.926503 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwm9x\" (UniqueName: \"kubernetes.io/projected/feadd2c8-de45-4819-8626-812b7007c679-kube-api-access-fwm9x\") pod \"observability-operator-cc5f78dfc-rmlf2\" (UID: \"feadd2c8-de45-4819-8626-812b7007c679\") " pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.926545 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dklfr\" (UniqueName: \"kubernetes.io/projected/ed2d3ffa-e89d-45d0-b131-8567b5abda08-kube-api-access-dklfr\") pod \"perses-operator-54bc95c9fb-8sh7j\" (UID: \"ed2d3ffa-e89d-45d0-b131-8567b5abda08\") " 
pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.927714 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2d3ffa-e89d-45d0-b131-8567b5abda08-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-8sh7j\" (UID: \"ed2d3ffa-e89d-45d0-b131-8567b5abda08\") " pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.947277 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r"] Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.950717 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dklfr\" (UniqueName: \"kubernetes.io/projected/ed2d3ffa-e89d-45d0-b131-8567b5abda08-kube-api-access-dklfr\") pod \"perses-operator-54bc95c9fb-8sh7j\" (UID: \"ed2d3ffa-e89d-45d0-b131-8567b5abda08\") " pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.954980 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwm9x\" (UniqueName: \"kubernetes.io/projected/feadd2c8-de45-4819-8626-812b7007c679-kube-api-access-fwm9x\") pod \"observability-operator-cc5f78dfc-rmlf2\" (UID: \"feadd2c8-de45-4819-8626-812b7007c679\") " pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" Oct 03 08:51:52 crc kubenswrapper[4790]: W1003 08:51:52.968398 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66046d21_5295_4dd4_9f38_3859fc71918e.slice/crio-3a43dddb2a7ca4283822a0a8974842f1e6455b85b26b0f17fb9d00a3e32dc781 WatchSource:0}: Error finding container 3a43dddb2a7ca4283822a0a8974842f1e6455b85b26b0f17fb9d00a3e32dc781: Status 404 returned error can't find the container with id 
3a43dddb2a7ca4283822a0a8974842f1e6455b85b26b0f17fb9d00a3e32dc781 Oct 03 08:51:52 crc kubenswrapper[4790]: I1003 08:51:52.990111 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r" event={"ID":"66046d21-5295-4dd4-9f38-3859fc71918e","Type":"ContainerStarted","Data":"3a43dddb2a7ca4283822a0a8974842f1e6455b85b26b0f17fb9d00a3e32dc781"} Oct 03 08:51:53 crc kubenswrapper[4790]: I1003 08:51:53.149428 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" Oct 03 08:51:53 crc kubenswrapper[4790]: I1003 08:51:53.305020 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx"] Oct 03 08:51:53 crc kubenswrapper[4790]: W1003 08:51:53.319439 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a774d8d_e6d0_4a3c_ad82_767e7a213d2e.slice/crio-324ae829bcfc7ca06ab3238085e104aa3d6f0c16dbd093a0e3ec372ad2741495 WatchSource:0}: Error finding container 324ae829bcfc7ca06ab3238085e104aa3d6f0c16dbd093a0e3ec372ad2741495: Status 404 returned error can't find the container with id 324ae829bcfc7ca06ab3238085e104aa3d6f0c16dbd093a0e3ec372ad2741495 Oct 03 08:51:53 crc kubenswrapper[4790]: I1003 08:51:53.321350 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd"] Oct 03 08:51:53 crc kubenswrapper[4790]: I1003 08:51:53.391434 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-8sh7j"] Oct 03 08:51:53 crc kubenswrapper[4790]: I1003 08:51:53.525851 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 03 08:51:53 crc kubenswrapper[4790]: I1003 08:51:53.533487 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/feadd2c8-de45-4819-8626-812b7007c679-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-rmlf2\" (UID: \"feadd2c8-de45-4819-8626-812b7007c679\") " pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" Oct 03 08:51:53 crc kubenswrapper[4790]: I1003 08:51:53.804224 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-tp5tb" Oct 03 08:51:53 crc kubenswrapper[4790]: I1003 08:51:53.813811 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" Oct 03 08:51:54 crc kubenswrapper[4790]: I1003 08:51:54.040559 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" event={"ID":"ed2d3ffa-e89d-45d0-b131-8567b5abda08","Type":"ContainerStarted","Data":"4747aa4b380267db942c839ca1a083895c9a29c5d7cf26c3faca1522213304ca"} Oct 03 08:51:54 crc kubenswrapper[4790]: I1003 08:51:54.058550 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" event={"ID":"2f0ce448-4fba-4522-b3d3-8287c471d89b","Type":"ContainerStarted","Data":"b981c69cd89e581e2635fa51eaafbeadb20b9be33a52225db74be2fdb0e333ba"} Oct 03 08:51:54 crc kubenswrapper[4790]: I1003 08:51:54.088941 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" event={"ID":"9a774d8d-e6d0-4a3c-ad82-767e7a213d2e","Type":"ContainerStarted","Data":"324ae829bcfc7ca06ab3238085e104aa3d6f0c16dbd093a0e3ec372ad2741495"} Oct 03 08:51:54 crc kubenswrapper[4790]: I1003 08:51:54.110083 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-rmlf2"] Oct 03 08:51:54 crc kubenswrapper[4790]: 
W1003 08:51:54.122812 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeadd2c8_de45_4819_8626_812b7007c679.slice/crio-7e90197e275e5121a68e33d189dd37cf0d0aa6345002ba5af75c938572b47b2d WatchSource:0}: Error finding container 7e90197e275e5121a68e33d189dd37cf0d0aa6345002ba5af75c938572b47b2d: Status 404 returned error can't find the container with id 7e90197e275e5121a68e33d189dd37cf0d0aa6345002ba5af75c938572b47b2d Oct 03 08:51:55 crc kubenswrapper[4790]: I1003 08:51:55.109200 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" event={"ID":"feadd2c8-de45-4819-8626-812b7007c679","Type":"ContainerStarted","Data":"7e90197e275e5121a68e33d189dd37cf0d0aa6345002ba5af75c938572b47b2d"} Oct 03 08:52:08 crc kubenswrapper[4790]: E1003 08:52:08.703816 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d" Oct 03 08:52:08 crc kubenswrapper[4790]: E1003 08:52:08.705513 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt 
--web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-f956bfc86-q2djd_openshift-operators(2f0ce448-4fba-4522-b3d3-8287c471d89b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:52:08 crc kubenswrapper[4790]: E1003 08:52:08.707155 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" podUID="2f0ce448-4fba-4522-b3d3-8287c471d89b" Oct 03 08:52:09 crc kubenswrapper[4790]: E1003 08:52:09.209127 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:e54c1e1301be66933f3ecb01d5a0ca27f58aabfd905b18b7d057bbf23bdb7b0d\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" podUID="2f0ce448-4fba-4522-b3d3-8287c471d89b" Oct 03 08:52:09 crc kubenswrapper[4790]: E1003 08:52:09.260301 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c" Oct 03 08:52:09 crc kubenswrapper[4790]: E1003 08:52:09.260512 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.2.2,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dklfr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-54bc95c9fb-8sh7j_openshift-operators(ed2d3ffa-e89d-45d0-b131-8567b5abda08): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 03 08:52:09 crc kubenswrapper[4790]: E1003 08:52:09.262345 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" podUID="ed2d3ffa-e89d-45d0-b131-8567b5abda08" Oct 03 08:52:10 crc kubenswrapper[4790]: I1003 08:52:10.186214 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:52:10 crc kubenswrapper[4790]: I1003 08:52:10.187176 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:52:10 crc kubenswrapper[4790]: I1003 08:52:10.217518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" event={"ID":"feadd2c8-de45-4819-8626-812b7007c679","Type":"ContainerStarted","Data":"bb1fbaea75e7ef7831a5ba2109d324c4d251c3d26b8ccb565fd590aed9693eed"} Oct 03 08:52:10 crc kubenswrapper[4790]: I1003 08:52:10.218185 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" Oct 03 08:52:10 crc kubenswrapper[4790]: I1003 08:52:10.220063 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r" event={"ID":"66046d21-5295-4dd4-9f38-3859fc71918e","Type":"ContainerStarted","Data":"e22e532fa6933e85ab9421dc1c079481677b65dc7d8111ca8325dc89d8d5fdbb"} Oct 03 08:52:10 crc kubenswrapper[4790]: I1003 08:52:10.223655 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" event={"ID":"9a774d8d-e6d0-4a3c-ad82-767e7a213d2e","Type":"ContainerStarted","Data":"ed49cbe685e8a9f715c917a8d8071e257c860537ea13b98e3d37f4270ecadd7f"} Oct 03 08:52:10 crc kubenswrapper[4790]: E1003 08:52:10.228899 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-0-1-rhel9-operator@sha256:bfed9f442aea6e8165644f1dc615beea06ec7fd84ea3f8ca393a63d3627c6a7c\\\"\"" pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" podUID="ed2d3ffa-e89d-45d0-b131-8567b5abda08" Oct 03 08:52:10 crc kubenswrapper[4790]: I1003 08:52:10.249063 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" Oct 03 08:52:10 crc kubenswrapper[4790]: I1003 08:52:10.271956 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-rmlf2" podStartSLOduration=3.119070792 podStartE2EDuration="18.271929637s" podCreationTimestamp="2025-10-03 08:51:52 +0000 UTC" firstStartedPulling="2025-10-03 08:51:54.126420005 +0000 UTC m=+700.479354243" lastFinishedPulling="2025-10-03 08:52:09.27927886 +0000 UTC m=+715.632213088" observedRunningTime="2025-10-03 08:52:10.246151235 +0000 UTC m=+716.599085483" watchObservedRunningTime="2025-10-03 08:52:10.271929637 +0000 UTC m=+716.624863875" Oct 03 08:52:10 crc kubenswrapper[4790]: I1003 08:52:10.313197 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-24zvx" podStartSLOduration=2.380930109 podStartE2EDuration="18.313169915s" podCreationTimestamp="2025-10-03 08:51:52 +0000 UTC" firstStartedPulling="2025-10-03 08:51:53.325141096 +0000 UTC 
m=+699.678075334" lastFinishedPulling="2025-10-03 08:52:09.257380902 +0000 UTC m=+715.610315140" observedRunningTime="2025-10-03 08:52:10.286775387 +0000 UTC m=+716.639709625" watchObservedRunningTime="2025-10-03 08:52:10.313169915 +0000 UTC m=+716.666104153" Oct 03 08:52:10 crc kubenswrapper[4790]: I1003 08:52:10.313898 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-wlf7r" podStartSLOduration=2.016949959 podStartE2EDuration="18.313892734s" podCreationTimestamp="2025-10-03 08:51:52 +0000 UTC" firstStartedPulling="2025-10-03 08:51:52.979193811 +0000 UTC m=+699.332128049" lastFinishedPulling="2025-10-03 08:52:09.276136586 +0000 UTC m=+715.629070824" observedRunningTime="2025-10-03 08:52:10.310108653 +0000 UTC m=+716.663042891" watchObservedRunningTime="2025-10-03 08:52:10.313892734 +0000 UTC m=+716.666826972" Oct 03 08:52:21 crc kubenswrapper[4790]: I1003 08:52:21.292332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" event={"ID":"2f0ce448-4fba-4522-b3d3-8287c471d89b","Type":"ContainerStarted","Data":"eb71b14fad5fbc08b432c10b597174bf1f185f3e006805223fd20b6c38d98f44"} Oct 03 08:52:21 crc kubenswrapper[4790]: I1003 08:52:21.319499 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-f956bfc86-q2djd" podStartSLOduration=-9223372007.535297 podStartE2EDuration="29.319479293s" podCreationTimestamp="2025-10-03 08:51:52 +0000 UTC" firstStartedPulling="2025-10-03 08:51:53.355841547 +0000 UTC m=+699.708775795" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:52:21.315196567 +0000 UTC m=+727.668130805" watchObservedRunningTime="2025-10-03 08:52:21.319479293 +0000 UTC m=+727.672413541" Oct 03 08:52:23 crc kubenswrapper[4790]: I1003 08:52:23.306646 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" event={"ID":"ed2d3ffa-e89d-45d0-b131-8567b5abda08","Type":"ContainerStarted","Data":"a4092d9e5d67adb85f5d82566b3a5994c34aebad2ef7a1feab7e1108178782e2"} Oct 03 08:52:23 crc kubenswrapper[4790]: I1003 08:52:23.307968 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" Oct 03 08:52:23 crc kubenswrapper[4790]: I1003 08:52:23.328661 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" podStartSLOduration=2.444704807 podStartE2EDuration="31.328632777s" podCreationTimestamp="2025-10-03 08:51:52 +0000 UTC" firstStartedPulling="2025-10-03 08:51:53.404609356 +0000 UTC m=+699.757543594" lastFinishedPulling="2025-10-03 08:52:22.288537316 +0000 UTC m=+728.641471564" observedRunningTime="2025-10-03 08:52:23.327169448 +0000 UTC m=+729.680103686" watchObservedRunningTime="2025-10-03 08:52:23.328632777 +0000 UTC m=+729.681567055" Oct 03 08:52:33 crc kubenswrapper[4790]: I1003 08:52:33.152414 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-8sh7j" Oct 03 08:52:40 crc kubenswrapper[4790]: I1003 08:52:40.185875 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:52:40 crc kubenswrapper[4790]: I1003 08:52:40.186550 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 03 08:52:40 crc kubenswrapper[4790]: I1003 08:52:40.740318 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tdsgg"] Oct 03 08:52:40 crc kubenswrapper[4790]: I1003 08:52:40.740990 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" podUID="3bf63bc2-e5fc-4974-b78e-185796fea03e" containerName="controller-manager" containerID="cri-o://e2db524342a88bc7dcd429091f1f63fcfaf4f9a51cde62b28fc9fd2132843d3a" gracePeriod=30 Oct 03 08:52:40 crc kubenswrapper[4790]: I1003 08:52:40.846855 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx"] Oct 03 08:52:40 crc kubenswrapper[4790]: I1003 08:52:40.847153 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" podUID="74be60a5-ccdf-40b5-bc76-ebf2196ae376" containerName="route-controller-manager" containerID="cri-o://9db2214616770ef4914f92507e05e694fad50a555ad92b0ae796b74294560fdd" gracePeriod=30 Oct 03 08:52:41 crc kubenswrapper[4790]: I1003 08:52:41.416329 4790 generic.go:334] "Generic (PLEG): container finished" podID="74be60a5-ccdf-40b5-bc76-ebf2196ae376" containerID="9db2214616770ef4914f92507e05e694fad50a555ad92b0ae796b74294560fdd" exitCode=0 Oct 03 08:52:41 crc kubenswrapper[4790]: I1003 08:52:41.416403 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" event={"ID":"74be60a5-ccdf-40b5-bc76-ebf2196ae376","Type":"ContainerDied","Data":"9db2214616770ef4914f92507e05e694fad50a555ad92b0ae796b74294560fdd"} Oct 03 08:52:41 crc kubenswrapper[4790]: I1003 08:52:41.417955 4790 generic.go:334] "Generic (PLEG): container finished" podID="3bf63bc2-e5fc-4974-b78e-185796fea03e" 
containerID="e2db524342a88bc7dcd429091f1f63fcfaf4f9a51cde62b28fc9fd2132843d3a" exitCode=0 Oct 03 08:52:41 crc kubenswrapper[4790]: I1003 08:52:41.418000 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" event={"ID":"3bf63bc2-e5fc-4974-b78e-185796fea03e","Type":"ContainerDied","Data":"e2db524342a88bc7dcd429091f1f63fcfaf4f9a51cde62b28fc9fd2132843d3a"} Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.389905 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.417051 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp"] Oct 03 08:52:42 crc kubenswrapper[4790]: E1003 08:52:42.417334 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf63bc2-e5fc-4974-b78e-185796fea03e" containerName="controller-manager" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.417351 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf63bc2-e5fc-4974-b78e-185796fea03e" containerName="controller-manager" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.417512 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf63bc2-e5fc-4974-b78e-185796fea03e" containerName="controller-manager" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.418059 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.431250 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" event={"ID":"3bf63bc2-e5fc-4974-b78e-185796fea03e","Type":"ContainerDied","Data":"baffeff961dc8555947bfd2a0d8dd5a3dbaccffc3162a3c75124d4e4b4bec747"} Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.431281 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tdsgg" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.431320 4790 scope.go:117] "RemoveContainer" containerID="e2db524342a88bc7dcd429091f1f63fcfaf4f9a51cde62b28fc9fd2132843d3a" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.435649 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" event={"ID":"74be60a5-ccdf-40b5-bc76-ebf2196ae376","Type":"ContainerDied","Data":"ce6b5ae317d76771beeaac7b0a3d9a71ec020b54a995aae36e41c7855e739511"} Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.435703 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce6b5ae317d76771beeaac7b0a3d9a71ec020b54a995aae36e41c7855e739511" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.439487 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp"] Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.485446 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.489740 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljf4f\" (UniqueName: \"kubernetes.io/projected/74be60a5-ccdf-40b5-bc76-ebf2196ae376-kube-api-access-ljf4f\") pod \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.489798 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drfp6\" (UniqueName: \"kubernetes.io/projected/3bf63bc2-e5fc-4974-b78e-185796fea03e-kube-api-access-drfp6\") pod \"3bf63bc2-e5fc-4974-b78e-185796fea03e\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.489821 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf63bc2-e5fc-4974-b78e-185796fea03e-serving-cert\") pod \"3bf63bc2-e5fc-4974-b78e-185796fea03e\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.489864 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-proxy-ca-bundles\") pod \"3bf63bc2-e5fc-4974-b78e-185796fea03e\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.489903 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-config\") pod \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.489925 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-client-ca\") pod \"3bf63bc2-e5fc-4974-b78e-185796fea03e\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.489964 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74be60a5-ccdf-40b5-bc76-ebf2196ae376-serving-cert\") pod \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.490012 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-config\") pod \"3bf63bc2-e5fc-4974-b78e-185796fea03e\" (UID: \"3bf63bc2-e5fc-4974-b78e-185796fea03e\") " Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.490052 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-client-ca\") pod \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\" (UID: \"74be60a5-ccdf-40b5-bc76-ebf2196ae376\") " Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.490229 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksgxv\" (UniqueName: \"kubernetes.io/projected/735f91e2-93fe-466d-b8e0-277a91194a3a-kube-api-access-ksgxv\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.490260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/735f91e2-93fe-466d-b8e0-277a91194a3a-config\") pod 
\"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.490281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/735f91e2-93fe-466d-b8e0-277a91194a3a-serving-cert\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.490309 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/735f91e2-93fe-466d-b8e0-277a91194a3a-client-ca\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.490333 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/735f91e2-93fe-466d-b8e0-277a91194a3a-proxy-ca-bundles\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.492259 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-config" (OuterVolumeSpecName: "config") pod "3bf63bc2-e5fc-4974-b78e-185796fea03e" (UID: "3bf63bc2-e5fc-4974-b78e-185796fea03e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.492946 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-config" (OuterVolumeSpecName: "config") pod "74be60a5-ccdf-40b5-bc76-ebf2196ae376" (UID: "74be60a5-ccdf-40b5-bc76-ebf2196ae376"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.493419 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-client-ca" (OuterVolumeSpecName: "client-ca") pod "74be60a5-ccdf-40b5-bc76-ebf2196ae376" (UID: "74be60a5-ccdf-40b5-bc76-ebf2196ae376"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.493746 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-client-ca" (OuterVolumeSpecName: "client-ca") pod "3bf63bc2-e5fc-4974-b78e-185796fea03e" (UID: "3bf63bc2-e5fc-4974-b78e-185796fea03e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.494043 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3bf63bc2-e5fc-4974-b78e-185796fea03e" (UID: "3bf63bc2-e5fc-4974-b78e-185796fea03e"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.500335 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74be60a5-ccdf-40b5-bc76-ebf2196ae376-kube-api-access-ljf4f" (OuterVolumeSpecName: "kube-api-access-ljf4f") pod "74be60a5-ccdf-40b5-bc76-ebf2196ae376" (UID: "74be60a5-ccdf-40b5-bc76-ebf2196ae376"). InnerVolumeSpecName "kube-api-access-ljf4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.504094 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf63bc2-e5fc-4974-b78e-185796fea03e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3bf63bc2-e5fc-4974-b78e-185796fea03e" (UID: "3bf63bc2-e5fc-4974-b78e-185796fea03e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.507229 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf63bc2-e5fc-4974-b78e-185796fea03e-kube-api-access-drfp6" (OuterVolumeSpecName: "kube-api-access-drfp6") pod "3bf63bc2-e5fc-4974-b78e-185796fea03e" (UID: "3bf63bc2-e5fc-4974-b78e-185796fea03e"). InnerVolumeSpecName "kube-api-access-drfp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.507833 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74be60a5-ccdf-40b5-bc76-ebf2196ae376-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "74be60a5-ccdf-40b5-bc76-ebf2196ae376" (UID: "74be60a5-ccdf-40b5-bc76-ebf2196ae376"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590639 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksgxv\" (UniqueName: \"kubernetes.io/projected/735f91e2-93fe-466d-b8e0-277a91194a3a-kube-api-access-ksgxv\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590684 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/735f91e2-93fe-466d-b8e0-277a91194a3a-config\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590704 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/735f91e2-93fe-466d-b8e0-277a91194a3a-serving-cert\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590730 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/735f91e2-93fe-466d-b8e0-277a91194a3a-client-ca\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590752 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/735f91e2-93fe-466d-b8e0-277a91194a3a-proxy-ca-bundles\") pod 
\"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590810 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590820 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590829 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590836 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74be60a5-ccdf-40b5-bc76-ebf2196ae376-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590844 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf63bc2-e5fc-4974-b78e-185796fea03e-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590852 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/74be60a5-ccdf-40b5-bc76-ebf2196ae376-client-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590863 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljf4f\" (UniqueName: \"kubernetes.io/projected/74be60a5-ccdf-40b5-bc76-ebf2196ae376-kube-api-access-ljf4f\") on node \"crc\" DevicePath \"\"" Oct 
03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590872 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drfp6\" (UniqueName: \"kubernetes.io/projected/3bf63bc2-e5fc-4974-b78e-185796fea03e-kube-api-access-drfp6\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.590880 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf63bc2-e5fc-4974-b78e-185796fea03e-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.592027 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/735f91e2-93fe-466d-b8e0-277a91194a3a-proxy-ca-bundles\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.592057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/735f91e2-93fe-466d-b8e0-277a91194a3a-client-ca\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.592660 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/735f91e2-93fe-466d-b8e0-277a91194a3a-config\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.595451 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/735f91e2-93fe-466d-b8e0-277a91194a3a-serving-cert\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.609732 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksgxv\" (UniqueName: \"kubernetes.io/projected/735f91e2-93fe-466d-b8e0-277a91194a3a-kube-api-access-ksgxv\") pod \"controller-manager-5db9d9ffdf-29mtp\" (UID: \"735f91e2-93fe-466d-b8e0-277a91194a3a\") " pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.760625 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tdsgg"] Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.765263 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tdsgg"] Oct 03 08:52:42 crc kubenswrapper[4790]: I1003 08:52:42.826137 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:43 crc kubenswrapper[4790]: I1003 08:52:43.017505 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp"] Oct 03 08:52:43 crc kubenswrapper[4790]: I1003 08:52:43.442305 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" event={"ID":"735f91e2-93fe-466d-b8e0-277a91194a3a","Type":"ContainerStarted","Data":"3f9cde8f3eca23b025227aabbfb941e8c7922519cbec0d9814342f0a5e83a94b"} Oct 03 08:52:43 crc kubenswrapper[4790]: I1003 08:52:43.443143 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:43 crc kubenswrapper[4790]: I1003 08:52:43.443169 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" event={"ID":"735f91e2-93fe-466d-b8e0-277a91194a3a","Type":"ContainerStarted","Data":"14308442be99ddfc616b417b519551ef7cf57659a9be04e17d10034bfb0afeb3"} Oct 03 08:52:43 crc kubenswrapper[4790]: I1003 08:52:43.445454 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx" Oct 03 08:52:43 crc kubenswrapper[4790]: I1003 08:52:43.482273 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" podStartSLOduration=3.48225081 podStartE2EDuration="3.48225081s" podCreationTimestamp="2025-10-03 08:52:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:52:43.482175398 +0000 UTC m=+749.835109646" watchObservedRunningTime="2025-10-03 08:52:43.48225081 +0000 UTC m=+749.835185048" Oct 03 08:52:43 crc kubenswrapper[4790]: I1003 08:52:43.503287 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx"] Oct 03 08:52:43 crc kubenswrapper[4790]: I1003 08:52:43.508143 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tvcgx"] Oct 03 08:52:43 crc kubenswrapper[4790]: I1003 08:52:43.536531 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5db9d9ffdf-29mtp" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.337132 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf63bc2-e5fc-4974-b78e-185796fea03e" path="/var/lib/kubelet/pods/3bf63bc2-e5fc-4974-b78e-185796fea03e/volumes" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.337834 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74be60a5-ccdf-40b5-bc76-ebf2196ae376" path="/var/lib/kubelet/pods/74be60a5-ccdf-40b5-bc76-ebf2196ae376/volumes" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.683751 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7"] Oct 03 08:52:44 crc kubenswrapper[4790]: E1003 08:52:44.683999 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74be60a5-ccdf-40b5-bc76-ebf2196ae376" containerName="route-controller-manager" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.684016 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="74be60a5-ccdf-40b5-bc76-ebf2196ae376" containerName="route-controller-manager" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.684141 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="74be60a5-ccdf-40b5-bc76-ebf2196ae376" containerName="route-controller-manager" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.684654 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.688010 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.688223 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.688335 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.688448 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.688601 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.688763 4790 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.692322 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7"] Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.742915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhgnj\" (UniqueName: \"kubernetes.io/projected/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-kube-api-access-vhgnj\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.742995 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-serving-cert\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.743031 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-config\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.743054 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-client-ca\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: 
\"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.845033 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhgnj\" (UniqueName: \"kubernetes.io/projected/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-kube-api-access-vhgnj\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.845318 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-serving-cert\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.845351 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-config\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.845368 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-client-ca\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.846141 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-client-ca\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.847043 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-config\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.861064 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-serving-cert\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:44 crc kubenswrapper[4790]: I1003 08:52:44.864246 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhgnj\" (UniqueName: \"kubernetes.io/projected/2ebb66b8-dc59-4933-8022-a1d70e4a7e95-kube-api-access-vhgnj\") pod \"route-controller-manager-5b9c97895b-2zbp7\" (UID: \"2ebb66b8-dc59-4933-8022-a1d70e4a7e95\") " pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:45 crc kubenswrapper[4790]: I1003 08:52:45.002868 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:45 crc kubenswrapper[4790]: I1003 08:52:45.228186 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7"] Oct 03 08:52:45 crc kubenswrapper[4790]: W1003 08:52:45.236273 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ebb66b8_dc59_4933_8022_a1d70e4a7e95.slice/crio-efa03eeee1513ed2c850b54badca5ec949242fad82a9622d3a90dbbc7b812828 WatchSource:0}: Error finding container efa03eeee1513ed2c850b54badca5ec949242fad82a9622d3a90dbbc7b812828: Status 404 returned error can't find the container with id efa03eeee1513ed2c850b54badca5ec949242fad82a9622d3a90dbbc7b812828 Oct 03 08:52:45 crc kubenswrapper[4790]: I1003 08:52:45.457052 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" event={"ID":"2ebb66b8-dc59-4933-8022-a1d70e4a7e95","Type":"ContainerStarted","Data":"dd25d19195217db098fa8dbf52ce7d7d6ed9bd6a082498a2b5db878c1342857a"} Oct 03 08:52:45 crc kubenswrapper[4790]: I1003 08:52:45.457108 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" event={"ID":"2ebb66b8-dc59-4933-8022-a1d70e4a7e95","Type":"ContainerStarted","Data":"efa03eeee1513ed2c850b54badca5ec949242fad82a9622d3a90dbbc7b812828"} Oct 03 08:52:45 crc kubenswrapper[4790]: I1003 08:52:45.479019 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" podStartSLOduration=5.478995932 podStartE2EDuration="5.478995932s" podCreationTimestamp="2025-10-03 08:52:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 08:52:45.473397301 +0000 UTC m=+751.826331559" watchObservedRunningTime="2025-10-03 08:52:45.478995932 +0000 UTC m=+751.831930170" Oct 03 08:52:46 crc kubenswrapper[4790]: I1003 08:52:46.462970 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:46 crc kubenswrapper[4790]: I1003 08:52:46.468723 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b9c97895b-2zbp7" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.108494 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2"] Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.109840 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.118892 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2"] Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.122761 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.222828 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.223104 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.223235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cwv5\" (UniqueName: \"kubernetes.io/projected/1db4e394-a858-4ab0-b051-3135c5516405-kube-api-access-9cwv5\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.324681 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.324751 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.324794 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cwv5\" (UniqueName: 
\"kubernetes.io/projected/1db4e394-a858-4ab0-b051-3135c5516405-kube-api-access-9cwv5\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.325331 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.325371 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.351318 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cwv5\" (UniqueName: \"kubernetes.io/projected/1db4e394-a858-4ab0-b051-3135c5516405-kube-api-access-9cwv5\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.431349 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.897438 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2"] Oct 03 08:52:50 crc kubenswrapper[4790]: W1003 08:52:50.905487 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1db4e394_a858_4ab0_b051_3135c5516405.slice/crio-02ed722dbb6d10b77a265d4c2bb327636603d478f7bfde7a0c7b8093ea143dec WatchSource:0}: Error finding container 02ed722dbb6d10b77a265d4c2bb327636603d478f7bfde7a0c7b8093ea143dec: Status 404 returned error can't find the container with id 02ed722dbb6d10b77a265d4c2bb327636603d478f7bfde7a0c7b8093ea143dec Oct 03 08:52:50 crc kubenswrapper[4790]: I1003 08:52:50.945171 4790 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 03 08:52:51 crc kubenswrapper[4790]: I1003 08:52:51.498224 4790 generic.go:334] "Generic (PLEG): container finished" podID="1db4e394-a858-4ab0-b051-3135c5516405" containerID="432eed6501e4dd0c8edfec2dc03b69fa0c40d02025b3c5ce5aa95a7b8c5c0569" exitCode=0 Oct 03 08:52:51 crc kubenswrapper[4790]: I1003 08:52:51.498350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" event={"ID":"1db4e394-a858-4ab0-b051-3135c5516405","Type":"ContainerDied","Data":"432eed6501e4dd0c8edfec2dc03b69fa0c40d02025b3c5ce5aa95a7b8c5c0569"} Oct 03 08:52:51 crc kubenswrapper[4790]: I1003 08:52:51.498658 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" 
event={"ID":"1db4e394-a858-4ab0-b051-3135c5516405","Type":"ContainerStarted","Data":"02ed722dbb6d10b77a265d4c2bb327636603d478f7bfde7a0c7b8093ea143dec"} Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.447007 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qrfh8"] Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.448373 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.475546 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrfh8"] Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.552203 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwcmk\" (UniqueName: \"kubernetes.io/projected/2d0fa506-504c-426e-b43e-d8f46b00fcd7-kube-api-access-lwcmk\") pod \"redhat-operators-qrfh8\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.552284 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-catalog-content\") pod \"redhat-operators-qrfh8\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.552328 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-utilities\") pod \"redhat-operators-qrfh8\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.654164 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-utilities\") pod \"redhat-operators-qrfh8\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.654276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwcmk\" (UniqueName: \"kubernetes.io/projected/2d0fa506-504c-426e-b43e-d8f46b00fcd7-kube-api-access-lwcmk\") pod \"redhat-operators-qrfh8\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.654313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-catalog-content\") pod \"redhat-operators-qrfh8\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.654864 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-catalog-content\") pod \"redhat-operators-qrfh8\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.655138 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-utilities\") pod \"redhat-operators-qrfh8\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.687228 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lwcmk\" (UniqueName: \"kubernetes.io/projected/2d0fa506-504c-426e-b43e-d8f46b00fcd7-kube-api-access-lwcmk\") pod \"redhat-operators-qrfh8\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:52 crc kubenswrapper[4790]: I1003 08:52:52.772056 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:52:53 crc kubenswrapper[4790]: I1003 08:52:53.264314 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrfh8"] Oct 03 08:52:53 crc kubenswrapper[4790]: I1003 08:52:53.512667 4790 generic.go:334] "Generic (PLEG): container finished" podID="1db4e394-a858-4ab0-b051-3135c5516405" containerID="31aba4caa4546b63798cdd3326a2eefdaf740768fccd172905c9fa2748db5f53" exitCode=0 Oct 03 08:52:53 crc kubenswrapper[4790]: I1003 08:52:53.512727 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" event={"ID":"1db4e394-a858-4ab0-b051-3135c5516405","Type":"ContainerDied","Data":"31aba4caa4546b63798cdd3326a2eefdaf740768fccd172905c9fa2748db5f53"} Oct 03 08:52:53 crc kubenswrapper[4790]: I1003 08:52:53.514418 4790 generic.go:334] "Generic (PLEG): container finished" podID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" containerID="00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4" exitCode=0 Oct 03 08:52:53 crc kubenswrapper[4790]: I1003 08:52:53.514451 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrfh8" event={"ID":"2d0fa506-504c-426e-b43e-d8f46b00fcd7","Type":"ContainerDied","Data":"00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4"} Oct 03 08:52:53 crc kubenswrapper[4790]: I1003 08:52:53.514480 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrfh8" 
event={"ID":"2d0fa506-504c-426e-b43e-d8f46b00fcd7","Type":"ContainerStarted","Data":"a2b8926605ea07dc3aba4b6b8370249f71a5807bd4cd287887f23c03f9205463"} Oct 03 08:52:54 crc kubenswrapper[4790]: I1003 08:52:54.525844 4790 generic.go:334] "Generic (PLEG): container finished" podID="1db4e394-a858-4ab0-b051-3135c5516405" containerID="fab5811e302f9cd88b40f450d0d72e36f5d8d896d54308055d94985a85235ff6" exitCode=0 Oct 03 08:52:54 crc kubenswrapper[4790]: I1003 08:52:54.526054 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" event={"ID":"1db4e394-a858-4ab0-b051-3135c5516405","Type":"ContainerDied","Data":"fab5811e302f9cd88b40f450d0d72e36f5d8d896d54308055d94985a85235ff6"} Oct 03 08:52:54 crc kubenswrapper[4790]: I1003 08:52:54.529024 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrfh8" event={"ID":"2d0fa506-504c-426e-b43e-d8f46b00fcd7","Type":"ContainerStarted","Data":"f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b"} Oct 03 08:52:55 crc kubenswrapper[4790]: I1003 08:52:55.536553 4790 generic.go:334] "Generic (PLEG): container finished" podID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" containerID="f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b" exitCode=0 Oct 03 08:52:55 crc kubenswrapper[4790]: I1003 08:52:55.538278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrfh8" event={"ID":"2d0fa506-504c-426e-b43e-d8f46b00fcd7","Type":"ContainerDied","Data":"f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b"} Oct 03 08:52:55 crc kubenswrapper[4790]: I1003 08:52:55.892629 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:55 crc kubenswrapper[4790]: I1003 08:52:55.998147 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-bundle\") pod \"1db4e394-a858-4ab0-b051-3135c5516405\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " Oct 03 08:52:55 crc kubenswrapper[4790]: I1003 08:52:55.998291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-util\") pod \"1db4e394-a858-4ab0-b051-3135c5516405\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " Oct 03 08:52:55 crc kubenswrapper[4790]: I1003 08:52:55.998351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cwv5\" (UniqueName: \"kubernetes.io/projected/1db4e394-a858-4ab0-b051-3135c5516405-kube-api-access-9cwv5\") pod \"1db4e394-a858-4ab0-b051-3135c5516405\" (UID: \"1db4e394-a858-4ab0-b051-3135c5516405\") " Oct 03 08:52:55 crc kubenswrapper[4790]: I1003 08:52:55.998950 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-bundle" (OuterVolumeSpecName: "bundle") pod "1db4e394-a858-4ab0-b051-3135c5516405" (UID: "1db4e394-a858-4ab0-b051-3135c5516405"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:52:56 crc kubenswrapper[4790]: I1003 08:52:56.009891 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db4e394-a858-4ab0-b051-3135c5516405-kube-api-access-9cwv5" (OuterVolumeSpecName: "kube-api-access-9cwv5") pod "1db4e394-a858-4ab0-b051-3135c5516405" (UID: "1db4e394-a858-4ab0-b051-3135c5516405"). InnerVolumeSpecName "kube-api-access-9cwv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:52:56 crc kubenswrapper[4790]: I1003 08:52:56.022809 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-util" (OuterVolumeSpecName: "util") pod "1db4e394-a858-4ab0-b051-3135c5516405" (UID: "1db4e394-a858-4ab0-b051-3135c5516405"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:52:56 crc kubenswrapper[4790]: I1003 08:52:56.100694 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:56 crc kubenswrapper[4790]: I1003 08:52:56.100755 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1db4e394-a858-4ab0-b051-3135c5516405-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:56 crc kubenswrapper[4790]: I1003 08:52:56.100774 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cwv5\" (UniqueName: \"kubernetes.io/projected/1db4e394-a858-4ab0-b051-3135c5516405-kube-api-access-9cwv5\") on node \"crc\" DevicePath \"\"" Oct 03 08:52:56 crc kubenswrapper[4790]: I1003 08:52:56.550447 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrfh8" event={"ID":"2d0fa506-504c-426e-b43e-d8f46b00fcd7","Type":"ContainerStarted","Data":"de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf"} Oct 03 08:52:56 crc kubenswrapper[4790]: I1003 08:52:56.553251 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" event={"ID":"1db4e394-a858-4ab0-b051-3135c5516405","Type":"ContainerDied","Data":"02ed722dbb6d10b77a265d4c2bb327636603d478f7bfde7a0c7b8093ea143dec"} Oct 03 08:52:56 crc kubenswrapper[4790]: I1003 08:52:56.553296 4790 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02ed722dbb6d10b77a265d4c2bb327636603d478f7bfde7a0c7b8093ea143dec" Oct 03 08:52:56 crc kubenswrapper[4790]: I1003 08:52:56.553317 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2" Oct 03 08:52:56 crc kubenswrapper[4790]: I1003 08:52:56.575162 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qrfh8" podStartSLOduration=2.166604162 podStartE2EDuration="4.575122581s" podCreationTimestamp="2025-10-03 08:52:52 +0000 UTC" firstStartedPulling="2025-10-03 08:52:53.515446816 +0000 UTC m=+759.868381054" lastFinishedPulling="2025-10-03 08:52:55.923965235 +0000 UTC m=+762.276899473" observedRunningTime="2025-10-03 08:52:56.571961966 +0000 UTC m=+762.924896224" watchObservedRunningTime="2025-10-03 08:52:56.575122581 +0000 UTC m=+762.928056859" Oct 03 08:52:58 crc kubenswrapper[4790]: I1003 08:52:58.963970 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn"] Oct 03 08:52:58 crc kubenswrapper[4790]: E1003 08:52:58.964616 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db4e394-a858-4ab0-b051-3135c5516405" containerName="pull" Oct 03 08:52:58 crc kubenswrapper[4790]: I1003 08:52:58.964636 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db4e394-a858-4ab0-b051-3135c5516405" containerName="pull" Oct 03 08:52:58 crc kubenswrapper[4790]: E1003 08:52:58.964694 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db4e394-a858-4ab0-b051-3135c5516405" containerName="extract" Oct 03 08:52:58 crc kubenswrapper[4790]: I1003 08:52:58.964703 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db4e394-a858-4ab0-b051-3135c5516405" containerName="extract" Oct 03 08:52:58 crc kubenswrapper[4790]: E1003 
08:52:58.964714 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db4e394-a858-4ab0-b051-3135c5516405" containerName="util" Oct 03 08:52:58 crc kubenswrapper[4790]: I1003 08:52:58.964722 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db4e394-a858-4ab0-b051-3135c5516405" containerName="util" Oct 03 08:52:58 crc kubenswrapper[4790]: I1003 08:52:58.964863 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db4e394-a858-4ab0-b051-3135c5516405" containerName="extract" Oct 03 08:52:58 crc kubenswrapper[4790]: I1003 08:52:58.965403 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn" Oct 03 08:52:58 crc kubenswrapper[4790]: I1003 08:52:58.967549 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 03 08:52:58 crc kubenswrapper[4790]: I1003 08:52:58.967549 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 03 08:52:58 crc kubenswrapper[4790]: I1003 08:52:58.967974 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gsggm" Oct 03 08:52:58 crc kubenswrapper[4790]: I1003 08:52:58.978301 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn"] Oct 03 08:52:59 crc kubenswrapper[4790]: I1003 08:52:59.039196 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ffgb\" (UniqueName: \"kubernetes.io/projected/5fb9da29-2f74-417d-8394-8f8fa9fea3ec-kube-api-access-2ffgb\") pod \"nmstate-operator-858ddd8f98-kgcvn\" (UID: \"5fb9da29-2f74-417d-8394-8f8fa9fea3ec\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn" Oct 03 08:52:59 crc kubenswrapper[4790]: I1003 08:52:59.140453 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2ffgb\" (UniqueName: \"kubernetes.io/projected/5fb9da29-2f74-417d-8394-8f8fa9fea3ec-kube-api-access-2ffgb\") pod \"nmstate-operator-858ddd8f98-kgcvn\" (UID: \"5fb9da29-2f74-417d-8394-8f8fa9fea3ec\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn" Oct 03 08:52:59 crc kubenswrapper[4790]: I1003 08:52:59.159931 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ffgb\" (UniqueName: \"kubernetes.io/projected/5fb9da29-2f74-417d-8394-8f8fa9fea3ec-kube-api-access-2ffgb\") pod \"nmstate-operator-858ddd8f98-kgcvn\" (UID: \"5fb9da29-2f74-417d-8394-8f8fa9fea3ec\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn" Oct 03 08:52:59 crc kubenswrapper[4790]: I1003 08:52:59.286850 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn" Oct 03 08:52:59 crc kubenswrapper[4790]: I1003 08:52:59.716171 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn"] Oct 03 08:52:59 crc kubenswrapper[4790]: W1003 08:52:59.726244 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fb9da29_2f74_417d_8394_8f8fa9fea3ec.slice/crio-4dba3a016e527af1a2eed24c3f8654fa7934e9e4cf0eb381c282aca83608c27b WatchSource:0}: Error finding container 4dba3a016e527af1a2eed24c3f8654fa7934e9e4cf0eb381c282aca83608c27b: Status 404 returned error can't find the container with id 4dba3a016e527af1a2eed24c3f8654fa7934e9e4cf0eb381c282aca83608c27b Oct 03 08:53:00 crc kubenswrapper[4790]: I1003 08:53:00.576765 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn" event={"ID":"5fb9da29-2f74-417d-8394-8f8fa9fea3ec","Type":"ContainerStarted","Data":"4dba3a016e527af1a2eed24c3f8654fa7934e9e4cf0eb381c282aca83608c27b"} Oct 03 08:53:02 crc kubenswrapper[4790]: I1003 
08:53:02.772685 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:53:02 crc kubenswrapper[4790]: I1003 08:53:02.773169 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:53:02 crc kubenswrapper[4790]: I1003 08:53:02.819862 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:53:03 crc kubenswrapper[4790]: I1003 08:53:03.594879 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn" event={"ID":"5fb9da29-2f74-417d-8394-8f8fa9fea3ec","Type":"ContainerStarted","Data":"875989b8da7e4b7e6479ce3b4f053effd62554071cdb6cf23012f4655e35df7b"} Oct 03 08:53:03 crc kubenswrapper[4790]: I1003 08:53:03.624884 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-kgcvn" podStartSLOduration=2.341350111 podStartE2EDuration="5.624865288s" podCreationTimestamp="2025-10-03 08:52:58 +0000 UTC" firstStartedPulling="2025-10-03 08:52:59.730207358 +0000 UTC m=+766.083141596" lastFinishedPulling="2025-10-03 08:53:03.013722535 +0000 UTC m=+769.366656773" observedRunningTime="2025-10-03 08:53:03.623522141 +0000 UTC m=+769.976456389" watchObservedRunningTime="2025-10-03 08:53:03.624865288 +0000 UTC m=+769.977799526" Oct 03 08:53:03 crc kubenswrapper[4790]: I1003 08:53:03.661816 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.591144 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp"] Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.592352 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.598655 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-fz855" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.603153 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp"] Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.608787 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8"] Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.614076 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.621747 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.640672 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8"] Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.654571 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-rdglh"] Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.655889 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.722002 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwzsq\" (UniqueName: \"kubernetes.io/projected/525f7376-4012-46b8-b79b-0c7bfb288354-kube-api-access-qwzsq\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.722046 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzbd\" (UniqueName: \"kubernetes.io/projected/8ad5530d-7fb8-4c5b-b0ef-df6837ebc877-kube-api-access-2kzbd\") pod \"nmstate-webhook-6cdbc54649-l4kg8\" (UID: \"8ad5530d-7fb8-4c5b-b0ef-df6837ebc877\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.722074 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdpjz\" (UniqueName: \"kubernetes.io/projected/d542c730-e799-4975-9b94-ae826e4706cf-kube-api-access-cdpjz\") pod \"nmstate-metrics-fdff9cb8d-hrwtp\" (UID: \"d542c730-e799-4975-9b94-ae826e4706cf\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.722295 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/525f7376-4012-46b8-b79b-0c7bfb288354-ovs-socket\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.722346 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/525f7376-4012-46b8-b79b-0c7bfb288354-dbus-socket\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.722498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8ad5530d-7fb8-4c5b-b0ef-df6837ebc877-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-l4kg8\" (UID: \"8ad5530d-7fb8-4c5b-b0ef-df6837ebc877\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.722532 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/525f7376-4012-46b8-b79b-0c7bfb288354-nmstate-lock\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.745942 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h"] Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.746714 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.758165 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6z77r" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.759902 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.760013 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.776407 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h"] Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.823807 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-kt62h\" (UID: \"1594d001-e795-49a2-bd6a-3a9ec48c2e4d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.823876 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzsq\" (UniqueName: \"kubernetes.io/projected/525f7376-4012-46b8-b79b-0c7bfb288354-kube-api-access-qwzsq\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.823905 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-kt62h\" (UID: \"1594d001-e795-49a2-bd6a-3a9ec48c2e4d\") " 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.823935 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzbd\" (UniqueName: \"kubernetes.io/projected/8ad5530d-7fb8-4c5b-b0ef-df6837ebc877-kube-api-access-2kzbd\") pod \"nmstate-webhook-6cdbc54649-l4kg8\" (UID: \"8ad5530d-7fb8-4c5b-b0ef-df6837ebc877\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.823954 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdpjz\" (UniqueName: \"kubernetes.io/projected/d542c730-e799-4975-9b94-ae826e4706cf-kube-api-access-cdpjz\") pod \"nmstate-metrics-fdff9cb8d-hrwtp\" (UID: \"d542c730-e799-4975-9b94-ae826e4706cf\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.823976 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/525f7376-4012-46b8-b79b-0c7bfb288354-ovs-socket\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.823992 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/525f7376-4012-46b8-b79b-0c7bfb288354-dbus-socket\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.824012 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lcms\" (UniqueName: \"kubernetes.io/projected/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-kube-api-access-7lcms\") pod \"nmstate-console-plugin-6b874cbd85-kt62h\" (UID: 
\"1594d001-e795-49a2-bd6a-3a9ec48c2e4d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.824046 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8ad5530d-7fb8-4c5b-b0ef-df6837ebc877-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-l4kg8\" (UID: \"8ad5530d-7fb8-4c5b-b0ef-df6837ebc877\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.824062 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/525f7376-4012-46b8-b79b-0c7bfb288354-nmstate-lock\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.824137 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/525f7376-4012-46b8-b79b-0c7bfb288354-nmstate-lock\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.824822 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/525f7376-4012-46b8-b79b-0c7bfb288354-ovs-socket\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.825120 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/525f7376-4012-46b8-b79b-0c7bfb288354-dbus-socket\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 
08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.832448 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8ad5530d-7fb8-4c5b-b0ef-df6837ebc877-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-l4kg8\" (UID: \"8ad5530d-7fb8-4c5b-b0ef-df6837ebc877\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.846491 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwzsq\" (UniqueName: \"kubernetes.io/projected/525f7376-4012-46b8-b79b-0c7bfb288354-kube-api-access-qwzsq\") pod \"nmstate-handler-rdglh\" (UID: \"525f7376-4012-46b8-b79b-0c7bfb288354\") " pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.853168 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdpjz\" (UniqueName: \"kubernetes.io/projected/d542c730-e799-4975-9b94-ae826e4706cf-kube-api-access-cdpjz\") pod \"nmstate-metrics-fdff9cb8d-hrwtp\" (UID: \"d542c730-e799-4975-9b94-ae826e4706cf\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.863368 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kzbd\" (UniqueName: \"kubernetes.io/projected/8ad5530d-7fb8-4c5b-b0ef-df6837ebc877-kube-api-access-2kzbd\") pod \"nmstate-webhook-6cdbc54649-l4kg8\" (UID: \"8ad5530d-7fb8-4c5b-b0ef-df6837ebc877\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.917907 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.925454 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lcms\" (UniqueName: \"kubernetes.io/projected/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-kube-api-access-7lcms\") pod \"nmstate-console-plugin-6b874cbd85-kt62h\" (UID: \"1594d001-e795-49a2-bd6a-3a9ec48c2e4d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.925562 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-kt62h\" (UID: \"1594d001-e795-49a2-bd6a-3a9ec48c2e4d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.925638 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-kt62h\" (UID: \"1594d001-e795-49a2-bd6a-3a9ec48c2e4d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.926794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-kt62h\" (UID: \"1594d001-e795-49a2-bd6a-3a9ec48c2e4d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:04 crc kubenswrapper[4790]: E1003 08:53:04.927163 4790 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 03 08:53:04 crc kubenswrapper[4790]: E1003 08:53:04.927216 
4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-plugin-serving-cert podName:1594d001-e795-49a2-bd6a-3a9ec48c2e4d nodeName:}" failed. No retries permitted until 2025-10-03 08:53:05.427200072 +0000 UTC m=+771.780134310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-kt62h" (UID: "1594d001-e795-49a2-bd6a-3a9ec48c2e4d") : secret "plugin-serving-cert" not found Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.937754 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.962743 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lcms\" (UniqueName: \"kubernetes.io/projected/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-kube-api-access-7lcms\") pod \"nmstate-console-plugin-6b874cbd85-kt62h\" (UID: \"1594d001-e795-49a2-bd6a-3a9ec48c2e4d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:04 crc kubenswrapper[4790]: I1003 08:53:04.979870 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.032867 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-57b5448b76-n8nt6"] Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.033922 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.059149 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57b5448b76-n8nt6"] Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.128653 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-trusted-ca-bundle\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.129133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84ccc043-ea21-497d-84c6-ff6a9fef0604-console-serving-cert\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.129175 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84ccc043-ea21-497d-84c6-ff6a9fef0604-console-oauth-config\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.129208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-service-ca\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.129239 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-console-config\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.131681 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x58px\" (UniqueName: \"kubernetes.io/projected/84ccc043-ea21-497d-84c6-ff6a9fef0604-kube-api-access-x58px\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.131824 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-oauth-serving-cert\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.233390 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-trusted-ca-bundle\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.233475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84ccc043-ea21-497d-84c6-ff6a9fef0604-console-serving-cert\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 
08:53:05.233499 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84ccc043-ea21-497d-84c6-ff6a9fef0604-console-oauth-config\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.233528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-console-config\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.233552 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-service-ca\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.233610 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x58px\" (UniqueName: \"kubernetes.io/projected/84ccc043-ea21-497d-84c6-ff6a9fef0604-kube-api-access-x58px\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.233653 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-oauth-serving-cert\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.234832 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-oauth-serving-cert\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.236039 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-service-ca\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.236053 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-trusted-ca-bundle\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.236814 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84ccc043-ea21-497d-84c6-ff6a9fef0604-console-config\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.239479 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrfh8"] Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.240790 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84ccc043-ea21-497d-84c6-ff6a9fef0604-console-serving-cert\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " 
pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.241180 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84ccc043-ea21-497d-84c6-ff6a9fef0604-console-oauth-config\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.252167 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x58px\" (UniqueName: \"kubernetes.io/projected/84ccc043-ea21-497d-84c6-ff6a9fef0604-kube-api-access-x58px\") pod \"console-57b5448b76-n8nt6\" (UID: \"84ccc043-ea21-497d-84c6-ff6a9fef0604\") " pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.361733 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.436277 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-kt62h\" (UID: \"1594d001-e795-49a2-bd6a-3a9ec48c2e4d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.441883 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1594d001-e795-49a2-bd6a-3a9ec48c2e4d-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-kt62h\" (UID: \"1594d001-e795-49a2-bd6a-3a9ec48c2e4d\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.446253 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp"] Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.451836 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8"] Oct 03 08:53:05 crc kubenswrapper[4790]: W1003 08:53:05.454438 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ad5530d_7fb8_4c5b_b0ef_df6837ebc877.slice/crio-0cb306b20a0cb492672f893804a157bfcece69340a0ccf71c0ed99b670844dfb WatchSource:0}: Error finding container 0cb306b20a0cb492672f893804a157bfcece69340a0ccf71c0ed99b670844dfb: Status 404 returned error can't find the container with id 0cb306b20a0cb492672f893804a157bfcece69340a0ccf71c0ed99b670844dfb Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.610258 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rdglh" event={"ID":"525f7376-4012-46b8-b79b-0c7bfb288354","Type":"ContainerStarted","Data":"ba424331c6f774d95bee0c50f83420aaf0dbc2b5ad14324371f4ee3b978815f2"} Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.611941 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp" event={"ID":"d542c730-e799-4975-9b94-ae826e4706cf","Type":"ContainerStarted","Data":"82f89eec31eaf3a886dd5c755ed4ffc6716c693d1857e85effecc7bd73fcd542"} Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.613334 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" event={"ID":"8ad5530d-7fb8-4c5b-b0ef-df6837ebc877","Type":"ContainerStarted","Data":"0cb306b20a0cb492672f893804a157bfcece69340a0ccf71c0ed99b670844dfb"} Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.613539 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qrfh8" podUID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" 
containerName="registry-server" containerID="cri-o://de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf" gracePeriod=2 Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.660449 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" Oct 03 08:53:05 crc kubenswrapper[4790]: I1003 08:53:05.793811 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57b5448b76-n8nt6"] Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.104196 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h"] Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.146453 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.249146 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwcmk\" (UniqueName: \"kubernetes.io/projected/2d0fa506-504c-426e-b43e-d8f46b00fcd7-kube-api-access-lwcmk\") pod \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.249312 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-utilities\") pod \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.249363 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-catalog-content\") pod \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\" (UID: \"2d0fa506-504c-426e-b43e-d8f46b00fcd7\") " Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 
08:53:06.250944 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-utilities" (OuterVolumeSpecName: "utilities") pod "2d0fa506-504c-426e-b43e-d8f46b00fcd7" (UID: "2d0fa506-504c-426e-b43e-d8f46b00fcd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.256915 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d0fa506-504c-426e-b43e-d8f46b00fcd7-kube-api-access-lwcmk" (OuterVolumeSpecName: "kube-api-access-lwcmk") pod "2d0fa506-504c-426e-b43e-d8f46b00fcd7" (UID: "2d0fa506-504c-426e-b43e-d8f46b00fcd7"). InnerVolumeSpecName "kube-api-access-lwcmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.332314 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d0fa506-504c-426e-b43e-d8f46b00fcd7" (UID: "2d0fa506-504c-426e-b43e-d8f46b00fcd7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.350702 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.350738 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d0fa506-504c-426e-b43e-d8f46b00fcd7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.350754 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwcmk\" (UniqueName: \"kubernetes.io/projected/2d0fa506-504c-426e-b43e-d8f46b00fcd7-kube-api-access-lwcmk\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.620609 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" event={"ID":"1594d001-e795-49a2-bd6a-3a9ec48c2e4d","Type":"ContainerStarted","Data":"90ee98b17557cb95a7a08ddb8551d266b7f4d3fe77a638f1ead13da7b17bab8f"} Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.623602 4790 generic.go:334] "Generic (PLEG): container finished" podID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" containerID="de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf" exitCode=0 Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.623686 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrfh8" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.623683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrfh8" event={"ID":"2d0fa506-504c-426e-b43e-d8f46b00fcd7","Type":"ContainerDied","Data":"de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf"} Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.623847 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrfh8" event={"ID":"2d0fa506-504c-426e-b43e-d8f46b00fcd7","Type":"ContainerDied","Data":"a2b8926605ea07dc3aba4b6b8370249f71a5807bd4cd287887f23c03f9205463"} Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.623880 4790 scope.go:117] "RemoveContainer" containerID="de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.627267 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57b5448b76-n8nt6" event={"ID":"84ccc043-ea21-497d-84c6-ff6a9fef0604","Type":"ContainerStarted","Data":"df680adfc67837a076cd6050c02bfe4fafd89a600152715f885e74cdd67068cc"} Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.627319 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57b5448b76-n8nt6" event={"ID":"84ccc043-ea21-497d-84c6-ff6a9fef0604","Type":"ContainerStarted","Data":"89bd41a59b50019d1b1d3cade8dcdd3f446046c73dc0410cd8397e28d5513f58"} Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.649532 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57b5448b76-n8nt6" podStartSLOduration=2.649511402 podStartE2EDuration="2.649511402s" podCreationTimestamp="2025-10-03 08:53:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:53:06.64683611 +0000 UTC 
m=+772.999770368" watchObservedRunningTime="2025-10-03 08:53:06.649511402 +0000 UTC m=+773.002445650" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.651193 4790 scope.go:117] "RemoveContainer" containerID="f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.666760 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrfh8"] Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.667771 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qrfh8"] Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.675059 4790 scope.go:117] "RemoveContainer" containerID="00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.690744 4790 scope.go:117] "RemoveContainer" containerID="de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf" Oct 03 08:53:06 crc kubenswrapper[4790]: E1003 08:53:06.691127 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf\": container with ID starting with de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf not found: ID does not exist" containerID="de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.691182 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf"} err="failed to get container status \"de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf\": rpc error: code = NotFound desc = could not find container \"de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf\": container with ID starting with 
de48adb1293998839eedc137fd002e845fba7053d13c6c92524f6279f39fdfbf not found: ID does not exist" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.691217 4790 scope.go:117] "RemoveContainer" containerID="f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b" Oct 03 08:53:06 crc kubenswrapper[4790]: E1003 08:53:06.691524 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b\": container with ID starting with f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b not found: ID does not exist" containerID="f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.691556 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b"} err="failed to get container status \"f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b\": rpc error: code = NotFound desc = could not find container \"f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b\": container with ID starting with f4d0e24c0b4008855fd52bbf65c3b9488f294b37a16e151700a199d11dd3737b not found: ID does not exist" Oct 03 08:53:06 crc kubenswrapper[4790]: I1003 08:53:06.691590 4790 scope.go:117] "RemoveContainer" containerID="00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4" Oct 03 08:53:06 crc kubenswrapper[4790]: E1003 08:53:06.691969 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4\": container with ID starting with 00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4 not found: ID does not exist" containerID="00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4" Oct 03 08:53:06 crc 
kubenswrapper[4790]: I1003 08:53:06.692004 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4"} err="failed to get container status \"00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4\": rpc error: code = NotFound desc = could not find container \"00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4\": container with ID starting with 00d18e7d67f3cfd792fe9b657acdc8618cb0da6f40e859543758938462b4c7a4 not found: ID does not exist" Oct 03 08:53:08 crc kubenswrapper[4790]: I1003 08:53:08.338955 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" path="/var/lib/kubelet/pods/2d0fa506-504c-426e-b43e-d8f46b00fcd7/volumes" Oct 03 08:53:08 crc kubenswrapper[4790]: I1003 08:53:08.643559 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp" event={"ID":"d542c730-e799-4975-9b94-ae826e4706cf","Type":"ContainerStarted","Data":"5d261d660a957aa512a31257be99f3075ea001bcb637e87182b62a034b4cf193"} Oct 03 08:53:08 crc kubenswrapper[4790]: I1003 08:53:08.646381 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" event={"ID":"8ad5530d-7fb8-4c5b-b0ef-df6837ebc877","Type":"ContainerStarted","Data":"88a5ad45957d31cbc8c8ad956bf5dd2c79c8a5bf95824886c8e4e4aa5b25e34d"} Oct 03 08:53:08 crc kubenswrapper[4790]: I1003 08:53:08.646492 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" Oct 03 08:53:08 crc kubenswrapper[4790]: I1003 08:53:08.649600 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rdglh" event={"ID":"525f7376-4012-46b8-b79b-0c7bfb288354","Type":"ContainerStarted","Data":"4849a168610577ca6fe2044aff6e6807329de1e5572c1f016ebf8127472e0dbc"} Oct 03 08:53:08 
crc kubenswrapper[4790]: I1003 08:53:08.650511 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:08 crc kubenswrapper[4790]: I1003 08:53:08.666370 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" podStartSLOduration=2.487671008 podStartE2EDuration="4.666345303s" podCreationTimestamp="2025-10-03 08:53:04 +0000 UTC" firstStartedPulling="2025-10-03 08:53:05.457096452 +0000 UTC m=+771.810030690" lastFinishedPulling="2025-10-03 08:53:07.635770757 +0000 UTC m=+773.988704985" observedRunningTime="2025-10-03 08:53:08.665975443 +0000 UTC m=+775.018909691" watchObservedRunningTime="2025-10-03 08:53:08.666345303 +0000 UTC m=+775.019279561" Oct 03 08:53:08 crc kubenswrapper[4790]: I1003 08:53:08.694117 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-rdglh" podStartSLOduration=2.170989242 podStartE2EDuration="4.694094818s" podCreationTimestamp="2025-10-03 08:53:04 +0000 UTC" firstStartedPulling="2025-10-03 08:53:05.090224939 +0000 UTC m=+771.443159177" lastFinishedPulling="2025-10-03 08:53:07.613330515 +0000 UTC m=+773.966264753" observedRunningTime="2025-10-03 08:53:08.691018096 +0000 UTC m=+775.043952344" watchObservedRunningTime="2025-10-03 08:53:08.694094818 +0000 UTC m=+775.047029066" Oct 03 08:53:09 crc kubenswrapper[4790]: I1003 08:53:09.664052 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" event={"ID":"1594d001-e795-49a2-bd6a-3a9ec48c2e4d","Type":"ContainerStarted","Data":"169056d9e1246a8f3cbb7b2726173bf2b8bd2131aae85e367dd0712ba8ed820b"} Oct 03 08:53:09 crc kubenswrapper[4790]: I1003 08:53:09.681900 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-kt62h" podStartSLOduration=3.272228336 
podStartE2EDuration="5.681879795s" podCreationTimestamp="2025-10-03 08:53:04 +0000 UTC" firstStartedPulling="2025-10-03 08:53:06.110774825 +0000 UTC m=+772.463709063" lastFinishedPulling="2025-10-03 08:53:08.520426284 +0000 UTC m=+774.873360522" observedRunningTime="2025-10-03 08:53:09.679489481 +0000 UTC m=+776.032423719" watchObservedRunningTime="2025-10-03 08:53:09.681879795 +0000 UTC m=+776.034814033" Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.186030 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.186105 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.186149 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.186764 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ae452d98acbb7343cd83b179511e310c36044718e3f86be8c5675a6c5f1473a"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.186822 4790 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://4ae452d98acbb7343cd83b179511e310c36044718e3f86be8c5675a6c5f1473a" gracePeriod=600 Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.674058 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp" event={"ID":"d542c730-e799-4975-9b94-ae826e4706cf","Type":"ContainerStarted","Data":"3c2e00b359bb21fcd6f4e4d786e3251df22272b4afcc689717bd344867a58554"} Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.678158 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="4ae452d98acbb7343cd83b179511e310c36044718e3f86be8c5675a6c5f1473a" exitCode=0 Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.678261 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"4ae452d98acbb7343cd83b179511e310c36044718e3f86be8c5675a6c5f1473a"} Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.678334 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"e2e53bba0341ebe043843fcef071e2d9f340680e77501673a1e1a360df997492"} Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.678359 4790 scope.go:117] "RemoveContainer" containerID="63ef82ccb7061f9ee1f5e4db190568a8e3d5bb60550f8a976b4df71eb3e198ad" Oct 03 08:53:10 crc kubenswrapper[4790]: I1003 08:53:10.697467 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrwtp" podStartSLOduration=2.346959048 podStartE2EDuration="6.697442647s" podCreationTimestamp="2025-10-03 08:53:04 +0000 
UTC" firstStartedPulling="2025-10-03 08:53:05.450890394 +0000 UTC m=+771.803824632" lastFinishedPulling="2025-10-03 08:53:09.801373973 +0000 UTC m=+776.154308231" observedRunningTime="2025-10-03 08:53:10.692526915 +0000 UTC m=+777.045461163" watchObservedRunningTime="2025-10-03 08:53:10.697442647 +0000 UTC m=+777.050376905" Oct 03 08:53:14 crc kubenswrapper[4790]: I1003 08:53:14.552208 4790 scope.go:117] "RemoveContainer" containerID="9db2214616770ef4914f92507e05e694fad50a555ad92b0ae796b74294560fdd" Oct 03 08:53:15 crc kubenswrapper[4790]: I1003 08:53:15.004783 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-rdglh" Oct 03 08:53:15 crc kubenswrapper[4790]: I1003 08:53:15.362223 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:15 crc kubenswrapper[4790]: I1003 08:53:15.362312 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:15 crc kubenswrapper[4790]: I1003 08:53:15.370723 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:15 crc kubenswrapper[4790]: I1003 08:53:15.723011 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57b5448b76-n8nt6" Oct 03 08:53:15 crc kubenswrapper[4790]: I1003 08:53:15.786371 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ggr2t"] Oct 03 08:53:24 crc kubenswrapper[4790]: I1003 08:53:24.946831 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-l4kg8" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.302971 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6"] 
Oct 03 08:53:38 crc kubenswrapper[4790]: E1003 08:53:38.303977 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" containerName="registry-server" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.303992 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" containerName="registry-server" Oct 03 08:53:38 crc kubenswrapper[4790]: E1003 08:53:38.304014 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" containerName="extract-utilities" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.304021 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" containerName="extract-utilities" Oct 03 08:53:38 crc kubenswrapper[4790]: E1003 08:53:38.304036 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" containerName="extract-content" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.304045 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" containerName="extract-content" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.304353 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d0fa506-504c-426e-b43e-d8f46b00fcd7" containerName="registry-server" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.305336 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.307616 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.314505 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6"] Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.393366 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.393413 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.393470 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z78n7\" (UniqueName: \"kubernetes.io/projected/cdfb6807-f905-4cef-973b-10f556f2fc96-kube-api-access-z78n7\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:38 crc kubenswrapper[4790]: 
I1003 08:53:38.494621 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.494675 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.494752 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z78n7\" (UniqueName: \"kubernetes.io/projected/cdfb6807-f905-4cef-973b-10f556f2fc96-kube-api-access-z78n7\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.495164 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.495676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.525830 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z78n7\" (UniqueName: \"kubernetes.io/projected/cdfb6807-f905-4cef-973b-10f556f2fc96-kube-api-access-z78n7\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:38 crc kubenswrapper[4790]: I1003 08:53:38.620026 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:39 crc kubenswrapper[4790]: I1003 08:53:39.116224 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6"] Oct 03 08:53:39 crc kubenswrapper[4790]: I1003 08:53:39.876436 4790 generic.go:334] "Generic (PLEG): container finished" podID="cdfb6807-f905-4cef-973b-10f556f2fc96" containerID="36e231fff7d0c6df7860a7f197f68c68fe0936d6ac2ccaca31860c0ef85ab153" exitCode=0 Oct 03 08:53:39 crc kubenswrapper[4790]: I1003 08:53:39.876559 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" event={"ID":"cdfb6807-f905-4cef-973b-10f556f2fc96","Type":"ContainerDied","Data":"36e231fff7d0c6df7860a7f197f68c68fe0936d6ac2ccaca31860c0ef85ab153"} Oct 03 08:53:39 crc kubenswrapper[4790]: I1003 08:53:39.876922 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" event={"ID":"cdfb6807-f905-4cef-973b-10f556f2fc96","Type":"ContainerStarted","Data":"9256adc712ea2aaa5eec78e882b7d44e0de7cfa308c558d464abdada1314c3d0"} Oct 03 08:53:40 crc kubenswrapper[4790]: I1003 08:53:40.832185 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ggr2t" podUID="f53170bf-fd8d-41c2-b353-50c5d5cff3d8" containerName="console" containerID="cri-o://60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1" gracePeriod=15 Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.348771 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ggr2t_f53170bf-fd8d-41c2-b353-50c5d5cff3d8/console/0.log" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.348861 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.430822 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-config\") pod \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.431414 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjchd\" (UniqueName: \"kubernetes.io/projected/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-kube-api-access-wjchd\") pod \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.431548 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-oauth-config\") pod \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.431706 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-serving-cert\") pod \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.431852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-oauth-serving-cert\") pod \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.431965 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-trusted-ca-bundle\") pod \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.432048 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-service-ca\") pod \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\" (UID: \"f53170bf-fd8d-41c2-b353-50c5d5cff3d8\") " Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.431998 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-config" (OuterVolumeSpecName: "console-config") pod "f53170bf-fd8d-41c2-b353-50c5d5cff3d8" (UID: "f53170bf-fd8d-41c2-b353-50c5d5cff3d8"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.432817 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-service-ca" (OuterVolumeSpecName: "service-ca") pod "f53170bf-fd8d-41c2-b353-50c5d5cff3d8" (UID: "f53170bf-fd8d-41c2-b353-50c5d5cff3d8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.433916 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f53170bf-fd8d-41c2-b353-50c5d5cff3d8" (UID: "f53170bf-fd8d-41c2-b353-50c5d5cff3d8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.434010 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f53170bf-fd8d-41c2-b353-50c5d5cff3d8" (UID: "f53170bf-fd8d-41c2-b353-50c5d5cff3d8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.447901 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f53170bf-fd8d-41c2-b353-50c5d5cff3d8" (UID: "f53170bf-fd8d-41c2-b353-50c5d5cff3d8"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.449208 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f53170bf-fd8d-41c2-b353-50c5d5cff3d8" (UID: "f53170bf-fd8d-41c2-b353-50c5d5cff3d8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.449299 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-kube-api-access-wjchd" (OuterVolumeSpecName: "kube-api-access-wjchd") pod "f53170bf-fd8d-41c2-b353-50c5d5cff3d8" (UID: "f53170bf-fd8d-41c2-b353-50c5d5cff3d8"). InnerVolumeSpecName "kube-api-access-wjchd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.533890 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.533931 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjchd\" (UniqueName: \"kubernetes.io/projected/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-kube-api-access-wjchd\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.533942 4790 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.533951 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.533959 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.533968 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.533979 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f53170bf-fd8d-41c2-b353-50c5d5cff3d8-service-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.890107 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ggr2t_f53170bf-fd8d-41c2-b353-50c5d5cff3d8/console/0.log" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.890172 4790 generic.go:334] "Generic (PLEG): container finished" podID="f53170bf-fd8d-41c2-b353-50c5d5cff3d8" containerID="60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1" exitCode=2 Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.890253 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ggr2t" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.890219 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ggr2t" event={"ID":"f53170bf-fd8d-41c2-b353-50c5d5cff3d8","Type":"ContainerDied","Data":"60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1"} Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.890383 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ggr2t" event={"ID":"f53170bf-fd8d-41c2-b353-50c5d5cff3d8","Type":"ContainerDied","Data":"db3e9e4f1ee8ad6d7c52a497378325ae5667f61f6e1c6d515e9f10dfbf54f0a5"} Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.890412 4790 scope.go:117] "RemoveContainer" containerID="60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.892691 4790 generic.go:334] "Generic (PLEG): container finished" podID="cdfb6807-f905-4cef-973b-10f556f2fc96" containerID="ca784001bdb95d8440d083f42de0c7c2c79a9ceee35b6ed6065fe7a09e19e71a" exitCode=0 Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.892721 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" event={"ID":"cdfb6807-f905-4cef-973b-10f556f2fc96","Type":"ContainerDied","Data":"ca784001bdb95d8440d083f42de0c7c2c79a9ceee35b6ed6065fe7a09e19e71a"} Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.905692 4790 scope.go:117] "RemoveContainer" containerID="60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1" Oct 03 08:53:41 crc kubenswrapper[4790]: E1003 08:53:41.906123 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1\": container with ID starting with 
60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1 not found: ID does not exist" containerID="60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.906166 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1"} err="failed to get container status \"60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1\": rpc error: code = NotFound desc = could not find container \"60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1\": container with ID starting with 60247fffbebd95e7197adf13ca5ae7db6d4f3985d854e90f215fb37cff5269a1 not found: ID does not exist" Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.928871 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ggr2t"] Oct 03 08:53:41 crc kubenswrapper[4790]: I1003 08:53:41.934915 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ggr2t"] Oct 03 08:53:42 crc kubenswrapper[4790]: I1003 08:53:42.335170 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53170bf-fd8d-41c2-b353-50c5d5cff3d8" path="/var/lib/kubelet/pods/f53170bf-fd8d-41c2-b353-50c5d5cff3d8/volumes" Oct 03 08:53:42 crc kubenswrapper[4790]: I1003 08:53:42.900245 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" event={"ID":"cdfb6807-f905-4cef-973b-10f556f2fc96","Type":"ContainerStarted","Data":"1a78a1b036250416a628ed8677ec4bac773d2f8f758675f95332545a2a42bd86"} Oct 03 08:53:42 crc kubenswrapper[4790]: I1003 08:53:42.923082 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" podStartSLOduration=3.458554768 
podStartE2EDuration="4.923054683s" podCreationTimestamp="2025-10-03 08:53:38 +0000 UTC" firstStartedPulling="2025-10-03 08:53:39.878613867 +0000 UTC m=+806.231548095" lastFinishedPulling="2025-10-03 08:53:41.343113752 +0000 UTC m=+807.696048010" observedRunningTime="2025-10-03 08:53:42.922764565 +0000 UTC m=+809.275698803" watchObservedRunningTime="2025-10-03 08:53:42.923054683 +0000 UTC m=+809.275988941" Oct 03 08:53:43 crc kubenswrapper[4790]: I1003 08:53:43.910795 4790 generic.go:334] "Generic (PLEG): container finished" podID="cdfb6807-f905-4cef-973b-10f556f2fc96" containerID="1a78a1b036250416a628ed8677ec4bac773d2f8f758675f95332545a2a42bd86" exitCode=0 Oct 03 08:53:43 crc kubenswrapper[4790]: I1003 08:53:43.910858 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" event={"ID":"cdfb6807-f905-4cef-973b-10f556f2fc96","Type":"ContainerDied","Data":"1a78a1b036250416a628ed8677ec4bac773d2f8f758675f95332545a2a42bd86"} Oct 03 08:53:44 crc kubenswrapper[4790]: I1003 08:53:44.889178 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8mb6b"] Oct 03 08:53:44 crc kubenswrapper[4790]: E1003 08:53:44.890019 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53170bf-fd8d-41c2-b353-50c5d5cff3d8" containerName="console" Oct 03 08:53:44 crc kubenswrapper[4790]: I1003 08:53:44.890062 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53170bf-fd8d-41c2-b353-50c5d5cff3d8" containerName="console" Oct 03 08:53:44 crc kubenswrapper[4790]: I1003 08:53:44.890500 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53170bf-fd8d-41c2-b353-50c5d5cff3d8" containerName="console" Oct 03 08:53:44 crc kubenswrapper[4790]: I1003 08:53:44.892694 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:44 crc kubenswrapper[4790]: I1003 08:53:44.926771 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mb6b"] Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.069458 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-utilities\") pod \"certified-operators-8mb6b\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.069815 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-catalog-content\") pod \"certified-operators-8mb6b\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.069857 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzmq7\" (UniqueName: \"kubernetes.io/projected/76bdf878-3f2c-49ae-a722-cedc2c075f67-kube-api-access-xzmq7\") pod \"certified-operators-8mb6b\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.170960 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-utilities\") pod \"certified-operators-8mb6b\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.171018 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-catalog-content\") pod \"certified-operators-8mb6b\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.171057 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzmq7\" (UniqueName: \"kubernetes.io/projected/76bdf878-3f2c-49ae-a722-cedc2c075f67-kube-api-access-xzmq7\") pod \"certified-operators-8mb6b\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.172057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-utilities\") pod \"certified-operators-8mb6b\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.172363 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-catalog-content\") pod \"certified-operators-8mb6b\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.192149 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzmq7\" (UniqueName: \"kubernetes.io/projected/76bdf878-3f2c-49ae-a722-cedc2c075f67-kube-api-access-xzmq7\") pod \"certified-operators-8mb6b\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.234512 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.262697 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.278503 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z78n7\" (UniqueName: \"kubernetes.io/projected/cdfb6807-f905-4cef-973b-10f556f2fc96-kube-api-access-z78n7\") pod \"cdfb6807-f905-4cef-973b-10f556f2fc96\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.278562 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-bundle\") pod \"cdfb6807-f905-4cef-973b-10f556f2fc96\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.280287 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-util\") pod \"cdfb6807-f905-4cef-973b-10f556f2fc96\" (UID: \"cdfb6807-f905-4cef-973b-10f556f2fc96\") " Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.286397 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-bundle" (OuterVolumeSpecName: "bundle") pod "cdfb6807-f905-4cef-973b-10f556f2fc96" (UID: "cdfb6807-f905-4cef-973b-10f556f2fc96"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.297787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfb6807-f905-4cef-973b-10f556f2fc96-kube-api-access-z78n7" (OuterVolumeSpecName: "kube-api-access-z78n7") pod "cdfb6807-f905-4cef-973b-10f556f2fc96" (UID: "cdfb6807-f905-4cef-973b-10f556f2fc96"). InnerVolumeSpecName "kube-api-access-z78n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.308052 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-util" (OuterVolumeSpecName: "util") pod "cdfb6807-f905-4cef-973b-10f556f2fc96" (UID: "cdfb6807-f905-4cef-973b-10f556f2fc96"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.382044 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z78n7\" (UniqueName: \"kubernetes.io/projected/cdfb6807-f905-4cef-973b-10f556f2fc96-kube-api-access-z78n7\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.382277 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.382285 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cdfb6807-f905-4cef-973b-10f556f2fc96-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.732546 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8mb6b"] Oct 03 08:53:45 crc kubenswrapper[4790]: W1003 08:53:45.738451 4790 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76bdf878_3f2c_49ae_a722_cedc2c075f67.slice/crio-5bdf117b1fe238f900cc36d732a06c9ed6b191573936ab72b04ae7dfc7d0d6af WatchSource:0}: Error finding container 5bdf117b1fe238f900cc36d732a06c9ed6b191573936ab72b04ae7dfc7d0d6af: Status 404 returned error can't find the container with id 5bdf117b1fe238f900cc36d732a06c9ed6b191573936ab72b04ae7dfc7d0d6af Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.946103 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.946163 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6" event={"ID":"cdfb6807-f905-4cef-973b-10f556f2fc96","Type":"ContainerDied","Data":"9256adc712ea2aaa5eec78e882b7d44e0de7cfa308c558d464abdada1314c3d0"} Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.946948 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9256adc712ea2aaa5eec78e882b7d44e0de7cfa308c558d464abdada1314c3d0" Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.949862 4790 generic.go:334] "Generic (PLEG): container finished" podID="76bdf878-3f2c-49ae-a722-cedc2c075f67" containerID="76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b" exitCode=0 Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.949904 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mb6b" event={"ID":"76bdf878-3f2c-49ae-a722-cedc2c075f67","Type":"ContainerDied","Data":"76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b"} Oct 03 08:53:45 crc kubenswrapper[4790]: I1003 08:53:45.949935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mb6b" 
event={"ID":"76bdf878-3f2c-49ae-a722-cedc2c075f67","Type":"ContainerStarted","Data":"5bdf117b1fe238f900cc36d732a06c9ed6b191573936ab72b04ae7dfc7d0d6af"} Oct 03 08:53:46 crc kubenswrapper[4790]: I1003 08:53:46.968499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mb6b" event={"ID":"76bdf878-3f2c-49ae-a722-cedc2c075f67","Type":"ContainerStarted","Data":"d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8"} Oct 03 08:53:47 crc kubenswrapper[4790]: I1003 08:53:47.975868 4790 generic.go:334] "Generic (PLEG): container finished" podID="76bdf878-3f2c-49ae-a722-cedc2c075f67" containerID="d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8" exitCode=0 Oct 03 08:53:47 crc kubenswrapper[4790]: I1003 08:53:47.975916 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mb6b" event={"ID":"76bdf878-3f2c-49ae-a722-cedc2c075f67","Type":"ContainerDied","Data":"d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8"} Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.464505 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4nmjt"] Oct 03 08:53:48 crc kubenswrapper[4790]: E1003 08:53:48.464767 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfb6807-f905-4cef-973b-10f556f2fc96" containerName="pull" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.464780 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfb6807-f905-4cef-973b-10f556f2fc96" containerName="pull" Oct 03 08:53:48 crc kubenswrapper[4790]: E1003 08:53:48.464788 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfb6807-f905-4cef-973b-10f556f2fc96" containerName="extract" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.464794 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfb6807-f905-4cef-973b-10f556f2fc96" containerName="extract" Oct 03 08:53:48 crc 
kubenswrapper[4790]: E1003 08:53:48.464801 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfb6807-f905-4cef-973b-10f556f2fc96" containerName="util" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.464808 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfb6807-f905-4cef-973b-10f556f2fc96" containerName="util" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.464904 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfb6807-f905-4cef-973b-10f556f2fc96" containerName="extract" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.465652 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.482657 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nmjt"] Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.624507 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzcst\" (UniqueName: \"kubernetes.io/projected/623a7260-3d53-47b8-a1e0-7b9287f21295-kube-api-access-fzcst\") pod \"redhat-marketplace-4nmjt\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.624611 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-catalog-content\") pod \"redhat-marketplace-4nmjt\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.624647 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-utilities\") pod \"redhat-marketplace-4nmjt\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.726344 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-catalog-content\") pod \"redhat-marketplace-4nmjt\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.726415 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-utilities\") pod \"redhat-marketplace-4nmjt\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.726487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzcst\" (UniqueName: \"kubernetes.io/projected/623a7260-3d53-47b8-a1e0-7b9287f21295-kube-api-access-fzcst\") pod \"redhat-marketplace-4nmjt\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.727891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-catalog-content\") pod \"redhat-marketplace-4nmjt\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.728553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-utilities\") pod \"redhat-marketplace-4nmjt\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.757243 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzcst\" (UniqueName: \"kubernetes.io/projected/623a7260-3d53-47b8-a1e0-7b9287f21295-kube-api-access-fzcst\") pod \"redhat-marketplace-4nmjt\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.787731 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:48 crc kubenswrapper[4790]: I1003 08:53:48.984820 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mb6b" event={"ID":"76bdf878-3f2c-49ae-a722-cedc2c075f67","Type":"ContainerStarted","Data":"e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266"} Oct 03 08:53:49 crc kubenswrapper[4790]: I1003 08:53:49.006024 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8mb6b" podStartSLOduration=2.475390176 podStartE2EDuration="5.006001683s" podCreationTimestamp="2025-10-03 08:53:44 +0000 UTC" firstStartedPulling="2025-10-03 08:53:45.952243404 +0000 UTC m=+812.305177642" lastFinishedPulling="2025-10-03 08:53:48.482854911 +0000 UTC m=+814.835789149" observedRunningTime="2025-10-03 08:53:49.004630316 +0000 UTC m=+815.357564574" watchObservedRunningTime="2025-10-03 08:53:49.006001683 +0000 UTC m=+815.358935921" Oct 03 08:53:49 crc kubenswrapper[4790]: I1003 08:53:49.211479 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nmjt"] Oct 03 08:53:49 crc kubenswrapper[4790]: W1003 08:53:49.217825 4790 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod623a7260_3d53_47b8_a1e0_7b9287f21295.slice/crio-11301549debef24165bb1c02c2769b59377d44915cf525c4e6841eb0d8f57285 WatchSource:0}: Error finding container 11301549debef24165bb1c02c2769b59377d44915cf525c4e6841eb0d8f57285: Status 404 returned error can't find the container with id 11301549debef24165bb1c02c2769b59377d44915cf525c4e6841eb0d8f57285 Oct 03 08:53:49 crc kubenswrapper[4790]: I1003 08:53:49.992162 4790 generic.go:334] "Generic (PLEG): container finished" podID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerID="798b732bca5a7600f2939d51a8670683db106e3dc4cf224bc5eab0e0e77f290e" exitCode=0 Oct 03 08:53:49 crc kubenswrapper[4790]: I1003 08:53:49.993036 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nmjt" event={"ID":"623a7260-3d53-47b8-a1e0-7b9287f21295","Type":"ContainerDied","Data":"798b732bca5a7600f2939d51a8670683db106e3dc4cf224bc5eab0e0e77f290e"} Oct 03 08:53:49 crc kubenswrapper[4790]: I1003 08:53:49.993067 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nmjt" event={"ID":"623a7260-3d53-47b8-a1e0-7b9287f21295","Type":"ContainerStarted","Data":"11301549debef24165bb1c02c2769b59377d44915cf525c4e6841eb0d8f57285"} Oct 03 08:53:51 crc kubenswrapper[4790]: I1003 08:53:51.000059 4790 generic.go:334] "Generic (PLEG): container finished" podID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerID="c87e55b19232b4f88fd90edcac00f4fe046f5aa994c9552bf903893ce29f7593" exitCode=0 Oct 03 08:53:51 crc kubenswrapper[4790]: I1003 08:53:51.000126 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nmjt" event={"ID":"623a7260-3d53-47b8-a1e0-7b9287f21295","Type":"ContainerDied","Data":"c87e55b19232b4f88fd90edcac00f4fe046f5aa994c9552bf903893ce29f7593"} Oct 03 08:53:52 crc kubenswrapper[4790]: I1003 
08:53:52.008668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nmjt" event={"ID":"623a7260-3d53-47b8-a1e0-7b9287f21295","Type":"ContainerStarted","Data":"25a1dc024169719310a65dcb89f0776c18201e3dba9bac7bf5d4c5239d74780b"} Oct 03 08:53:52 crc kubenswrapper[4790]: I1003 08:53:52.031216 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4nmjt" podStartSLOduration=2.581517254 podStartE2EDuration="4.031195376s" podCreationTimestamp="2025-10-03 08:53:48 +0000 UTC" firstStartedPulling="2025-10-03 08:53:49.993403417 +0000 UTC m=+816.346337655" lastFinishedPulling="2025-10-03 08:53:51.443081529 +0000 UTC m=+817.796015777" observedRunningTime="2025-10-03 08:53:52.027515106 +0000 UTC m=+818.380449354" watchObservedRunningTime="2025-10-03 08:53:52.031195376 +0000 UTC m=+818.384129614" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.742992 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls"] Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.744320 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.747254 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.747875 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.748328 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.749032 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ccn68" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.749281 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.763994 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls"] Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.898319 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ae3495c-a801-4526-87ed-b8e81ea71d53-webhook-cert\") pod \"metallb-operator-controller-manager-c97c79c55-pqbls\" (UID: \"1ae3495c-a801-4526-87ed-b8e81ea71d53\") " pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.898403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ae3495c-a801-4526-87ed-b8e81ea71d53-apiservice-cert\") pod \"metallb-operator-controller-manager-c97c79c55-pqbls\" (UID: 
\"1ae3495c-a801-4526-87ed-b8e81ea71d53\") " pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.898621 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn5rv\" (UniqueName: \"kubernetes.io/projected/1ae3495c-a801-4526-87ed-b8e81ea71d53-kube-api-access-fn5rv\") pod \"metallb-operator-controller-manager-c97c79c55-pqbls\" (UID: \"1ae3495c-a801-4526-87ed-b8e81ea71d53\") " pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.973840 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5"] Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.974799 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.978042 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.978232 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 03 08:53:53 crc kubenswrapper[4790]: I1003 08:53:53.978355 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-kpcjs" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.004779 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn5rv\" (UniqueName: \"kubernetes.io/projected/1ae3495c-a801-4526-87ed-b8e81ea71d53-kube-api-access-fn5rv\") pod \"metallb-operator-controller-manager-c97c79c55-pqbls\" (UID: \"1ae3495c-a801-4526-87ed-b8e81ea71d53\") " pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:54 crc 
kubenswrapper[4790]: I1003 08:53:54.004853 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5d65866-e446-4512-b8aa-566a283ff5be-webhook-cert\") pod \"metallb-operator-webhook-server-687d9f4499-mr2f5\" (UID: \"f5d65866-e446-4512-b8aa-566a283ff5be\") " pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.004906 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5d65866-e446-4512-b8aa-566a283ff5be-apiservice-cert\") pod \"metallb-operator-webhook-server-687d9f4499-mr2f5\" (UID: \"f5d65866-e446-4512-b8aa-566a283ff5be\") " pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.004932 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpnxw\" (UniqueName: \"kubernetes.io/projected/f5d65866-e446-4512-b8aa-566a283ff5be-kube-api-access-xpnxw\") pod \"metallb-operator-webhook-server-687d9f4499-mr2f5\" (UID: \"f5d65866-e446-4512-b8aa-566a283ff5be\") " pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.004978 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ae3495c-a801-4526-87ed-b8e81ea71d53-webhook-cert\") pod \"metallb-operator-controller-manager-c97c79c55-pqbls\" (UID: \"1ae3495c-a801-4526-87ed-b8e81ea71d53\") " pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.005011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/1ae3495c-a801-4526-87ed-b8e81ea71d53-apiservice-cert\") pod \"metallb-operator-controller-manager-c97c79c55-pqbls\" (UID: \"1ae3495c-a801-4526-87ed-b8e81ea71d53\") " pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.006239 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5"] Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.011826 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1ae3495c-a801-4526-87ed-b8e81ea71d53-apiservice-cert\") pod \"metallb-operator-controller-manager-c97c79c55-pqbls\" (UID: \"1ae3495c-a801-4526-87ed-b8e81ea71d53\") " pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.011859 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1ae3495c-a801-4526-87ed-b8e81ea71d53-webhook-cert\") pod \"metallb-operator-controller-manager-c97c79c55-pqbls\" (UID: \"1ae3495c-a801-4526-87ed-b8e81ea71d53\") " pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.043376 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn5rv\" (UniqueName: \"kubernetes.io/projected/1ae3495c-a801-4526-87ed-b8e81ea71d53-kube-api-access-fn5rv\") pod \"metallb-operator-controller-manager-c97c79c55-pqbls\" (UID: \"1ae3495c-a801-4526-87ed-b8e81ea71d53\") " pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.061140 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.105819 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5d65866-e446-4512-b8aa-566a283ff5be-webhook-cert\") pod \"metallb-operator-webhook-server-687d9f4499-mr2f5\" (UID: \"f5d65866-e446-4512-b8aa-566a283ff5be\") " pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.106301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5d65866-e446-4512-b8aa-566a283ff5be-apiservice-cert\") pod \"metallb-operator-webhook-server-687d9f4499-mr2f5\" (UID: \"f5d65866-e446-4512-b8aa-566a283ff5be\") " pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.106338 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpnxw\" (UniqueName: \"kubernetes.io/projected/f5d65866-e446-4512-b8aa-566a283ff5be-kube-api-access-xpnxw\") pod \"metallb-operator-webhook-server-687d9f4499-mr2f5\" (UID: \"f5d65866-e446-4512-b8aa-566a283ff5be\") " pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.111041 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5d65866-e446-4512-b8aa-566a283ff5be-webhook-cert\") pod \"metallb-operator-webhook-server-687d9f4499-mr2f5\" (UID: \"f5d65866-e446-4512-b8aa-566a283ff5be\") " pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.115745 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/f5d65866-e446-4512-b8aa-566a283ff5be-apiservice-cert\") pod \"metallb-operator-webhook-server-687d9f4499-mr2f5\" (UID: \"f5d65866-e446-4512-b8aa-566a283ff5be\") " pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.126046 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpnxw\" (UniqueName: \"kubernetes.io/projected/f5d65866-e446-4512-b8aa-566a283ff5be-kube-api-access-xpnxw\") pod \"metallb-operator-webhook-server-687d9f4499-mr2f5\" (UID: \"f5d65866-e446-4512-b8aa-566a283ff5be\") " pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.288077 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.519013 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls"] Oct 03 08:53:54 crc kubenswrapper[4790]: W1003 08:53:54.529448 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ae3495c_a801_4526_87ed_b8e81ea71d53.slice/crio-cd515a7de6e6ba837f30e481071672a4ce5beb80dd7c59c0e91e10fa767f3b35 WatchSource:0}: Error finding container cd515a7de6e6ba837f30e481071672a4ce5beb80dd7c59c0e91e10fa767f3b35: Status 404 returned error can't find the container with id cd515a7de6e6ba837f30e481071672a4ce5beb80dd7c59c0e91e10fa767f3b35 Oct 03 08:53:54 crc kubenswrapper[4790]: W1003 08:53:54.757730 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5d65866_e446_4512_b8aa_566a283ff5be.slice/crio-668692130bd67e061f07043e9c4f0ef1455fe277488dd7df28223c005d603951 WatchSource:0}: Error finding container 
668692130bd67e061f07043e9c4f0ef1455fe277488dd7df28223c005d603951: Status 404 returned error can't find the container with id 668692130bd67e061f07043e9c4f0ef1455fe277488dd7df28223c005d603951 Oct 03 08:53:54 crc kubenswrapper[4790]: I1003 08:53:54.759794 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5"] Oct 03 08:53:55 crc kubenswrapper[4790]: I1003 08:53:55.026441 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" event={"ID":"1ae3495c-a801-4526-87ed-b8e81ea71d53","Type":"ContainerStarted","Data":"cd515a7de6e6ba837f30e481071672a4ce5beb80dd7c59c0e91e10fa767f3b35"} Oct 03 08:53:55 crc kubenswrapper[4790]: I1003 08:53:55.027754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" event={"ID":"f5d65866-e446-4512-b8aa-566a283ff5be","Type":"ContainerStarted","Data":"668692130bd67e061f07043e9c4f0ef1455fe277488dd7df28223c005d603951"} Oct 03 08:53:55 crc kubenswrapper[4790]: I1003 08:53:55.263873 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:55 crc kubenswrapper[4790]: I1003 08:53:55.264133 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:55 crc kubenswrapper[4790]: I1003 08:53:55.327024 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:56 crc kubenswrapper[4790]: I1003 08:53:56.086772 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.264767 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2jxt5"] Oct 03 
08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.271226 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.272679 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2jxt5"] Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.467835 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24dhs\" (UniqueName: \"kubernetes.io/projected/2a6d725a-e84a-4337-b71a-0af4ba825541-kube-api-access-24dhs\") pod \"community-operators-2jxt5\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.467935 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-catalog-content\") pod \"community-operators-2jxt5\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.468021 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-utilities\") pod \"community-operators-2jxt5\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.568856 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24dhs\" (UniqueName: \"kubernetes.io/projected/2a6d725a-e84a-4337-b71a-0af4ba825541-kube-api-access-24dhs\") pod \"community-operators-2jxt5\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " 
pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.568971 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-catalog-content\") pod \"community-operators-2jxt5\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.569203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-utilities\") pod \"community-operators-2jxt5\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.569846 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-catalog-content\") pod \"community-operators-2jxt5\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.569981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-utilities\") pod \"community-operators-2jxt5\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.592302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24dhs\" (UniqueName: \"kubernetes.io/projected/2a6d725a-e84a-4337-b71a-0af4ba825541-kube-api-access-24dhs\") pod \"community-operators-2jxt5\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " 
pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.606355 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.788884 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.789938 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:58 crc kubenswrapper[4790]: I1003 08:53:58.853273 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:53:59 crc kubenswrapper[4790]: I1003 08:53:59.108290 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:54:00 crc kubenswrapper[4790]: I1003 08:54:00.065401 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" event={"ID":"f5d65866-e446-4512-b8aa-566a283ff5be","Type":"ContainerStarted","Data":"19a240897db0ed42c0bb946f79be1c34ecbf675e966976756666454a89ed5a46"} Oct 03 08:54:00 crc kubenswrapper[4790]: I1003 08:54:00.065848 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:54:00 crc kubenswrapper[4790]: I1003 08:54:00.066971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" event={"ID":"1ae3495c-a801-4526-87ed-b8e81ea71d53","Type":"ContainerStarted","Data":"4a10564a236dee4f3cc154de68ef1a07c3230fd76156d352921afe51d828aca3"} Oct 03 08:54:00 crc kubenswrapper[4790]: I1003 08:54:00.067214 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:54:00 crc kubenswrapper[4790]: I1003 08:54:00.086589 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" podStartSLOduration=2.149087636 podStartE2EDuration="7.086546028s" podCreationTimestamp="2025-10-03 08:53:53 +0000 UTC" firstStartedPulling="2025-10-03 08:53:54.760859149 +0000 UTC m=+821.113793377" lastFinishedPulling="2025-10-03 08:53:59.698317531 +0000 UTC m=+826.051251769" observedRunningTime="2025-10-03 08:54:00.08401785 +0000 UTC m=+826.436952098" watchObservedRunningTime="2025-10-03 08:54:00.086546028 +0000 UTC m=+826.439480276" Oct 03 08:54:00 crc kubenswrapper[4790]: I1003 08:54:00.164832 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" podStartSLOduration=2.043518428 podStartE2EDuration="7.164807984s" podCreationTimestamp="2025-10-03 08:53:53 +0000 UTC" firstStartedPulling="2025-10-03 08:53:54.539530887 +0000 UTC m=+820.892465125" lastFinishedPulling="2025-10-03 08:53:59.660820433 +0000 UTC m=+826.013754681" observedRunningTime="2025-10-03 08:54:00.128136757 +0000 UTC m=+826.481070995" watchObservedRunningTime="2025-10-03 08:54:00.164807984 +0000 UTC m=+826.517742212" Oct 03 08:54:00 crc kubenswrapper[4790]: I1003 08:54:00.170514 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2jxt5"] Oct 03 08:54:01 crc kubenswrapper[4790]: I1003 08:54:01.075415 4790 generic.go:334] "Generic (PLEG): container finished" podID="2a6d725a-e84a-4337-b71a-0af4ba825541" containerID="4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf" exitCode=0 Oct 03 08:54:01 crc kubenswrapper[4790]: I1003 08:54:01.075668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jxt5" 
event={"ID":"2a6d725a-e84a-4337-b71a-0af4ba825541","Type":"ContainerDied","Data":"4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf"} Oct 03 08:54:01 crc kubenswrapper[4790]: I1003 08:54:01.076204 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jxt5" event={"ID":"2a6d725a-e84a-4337-b71a-0af4ba825541","Type":"ContainerStarted","Data":"befe9412a03a3d40ad2a658e0f3c6f9027231461300a8867081ae236d730d16b"} Oct 03 08:54:01 crc kubenswrapper[4790]: I1003 08:54:01.452246 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mb6b"] Oct 03 08:54:01 crc kubenswrapper[4790]: I1003 08:54:01.452520 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8mb6b" podUID="76bdf878-3f2c-49ae-a722-cedc2c075f67" containerName="registry-server" containerID="cri-o://e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266" gracePeriod=2 Oct 03 08:54:01 crc kubenswrapper[4790]: I1003 08:54:01.901196 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.031128 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-utilities\") pod \"76bdf878-3f2c-49ae-a722-cedc2c075f67\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.031180 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzmq7\" (UniqueName: \"kubernetes.io/projected/76bdf878-3f2c-49ae-a722-cedc2c075f67-kube-api-access-xzmq7\") pod \"76bdf878-3f2c-49ae-a722-cedc2c075f67\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.031300 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-catalog-content\") pod \"76bdf878-3f2c-49ae-a722-cedc2c075f67\" (UID: \"76bdf878-3f2c-49ae-a722-cedc2c075f67\") " Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.031966 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-utilities" (OuterVolumeSpecName: "utilities") pod "76bdf878-3f2c-49ae-a722-cedc2c075f67" (UID: "76bdf878-3f2c-49ae-a722-cedc2c075f67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.041511 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76bdf878-3f2c-49ae-a722-cedc2c075f67-kube-api-access-xzmq7" (OuterVolumeSpecName: "kube-api-access-xzmq7") pod "76bdf878-3f2c-49ae-a722-cedc2c075f67" (UID: "76bdf878-3f2c-49ae-a722-cedc2c075f67"). InnerVolumeSpecName "kube-api-access-xzmq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.081201 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76bdf878-3f2c-49ae-a722-cedc2c075f67" (UID: "76bdf878-3f2c-49ae-a722-cedc2c075f67"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.084705 4790 generic.go:334] "Generic (PLEG): container finished" podID="76bdf878-3f2c-49ae-a722-cedc2c075f67" containerID="e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266" exitCode=0 Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.084775 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mb6b" event={"ID":"76bdf878-3f2c-49ae-a722-cedc2c075f67","Type":"ContainerDied","Data":"e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266"} Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.084804 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8mb6b" event={"ID":"76bdf878-3f2c-49ae-a722-cedc2c075f67","Type":"ContainerDied","Data":"5bdf117b1fe238f900cc36d732a06c9ed6b191573936ab72b04ae7dfc7d0d6af"} Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.084821 4790 scope.go:117] "RemoveContainer" containerID="e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.084925 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8mb6b" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.087773 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jxt5" event={"ID":"2a6d725a-e84a-4337-b71a-0af4ba825541","Type":"ContainerStarted","Data":"dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4"} Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.103089 4790 scope.go:117] "RemoveContainer" containerID="d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.124731 4790 scope.go:117] "RemoveContainer" containerID="76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.132367 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.132405 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76bdf878-3f2c-49ae-a722-cedc2c075f67-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.132419 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzmq7\" (UniqueName: \"kubernetes.io/projected/76bdf878-3f2c-49ae-a722-cedc2c075f67-kube-api-access-xzmq7\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.137365 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8mb6b"] Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.144336 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8mb6b"] Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.161180 4790 scope.go:117] 
"RemoveContainer" containerID="e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266" Oct 03 08:54:02 crc kubenswrapper[4790]: E1003 08:54:02.164819 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266\": container with ID starting with e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266 not found: ID does not exist" containerID="e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.164865 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266"} err="failed to get container status \"e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266\": rpc error: code = NotFound desc = could not find container \"e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266\": container with ID starting with e9d6dd0eb5c6e2040500a405930e80ca733afa95f09a4249697ed65b365cb266 not found: ID does not exist" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.164893 4790 scope.go:117] "RemoveContainer" containerID="d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8" Oct 03 08:54:02 crc kubenswrapper[4790]: E1003 08:54:02.165418 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8\": container with ID starting with d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8 not found: ID does not exist" containerID="d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.165459 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8"} err="failed to get container status \"d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8\": rpc error: code = NotFound desc = could not find container \"d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8\": container with ID starting with d1025ab17477825a030c78250ba51ed10ca2c835b21dce6ff146e92d6dc55ab8 not found: ID does not exist" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.165488 4790 scope.go:117] "RemoveContainer" containerID="76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b" Oct 03 08:54:02 crc kubenswrapper[4790]: E1003 08:54:02.165962 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b\": container with ID starting with 76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b not found: ID does not exist" containerID="76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.165991 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b"} err="failed to get container status \"76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b\": rpc error: code = NotFound desc = could not find container \"76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b\": container with ID starting with 76e732ec77a7ebecb480a01c4d69b3f939c1e9bb4635740bed659d8c9ceaa10b not found: ID does not exist" Oct 03 08:54:02 crc kubenswrapper[4790]: I1003 08:54:02.335656 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76bdf878-3f2c-49ae-a722-cedc2c075f67" path="/var/lib/kubelet/pods/76bdf878-3f2c-49ae-a722-cedc2c075f67/volumes" Oct 03 08:54:03 crc kubenswrapper[4790]: I1003 
08:54:03.098005 4790 generic.go:334] "Generic (PLEG): container finished" podID="2a6d725a-e84a-4337-b71a-0af4ba825541" containerID="dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4" exitCode=0 Oct 03 08:54:03 crc kubenswrapper[4790]: I1003 08:54:03.098097 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jxt5" event={"ID":"2a6d725a-e84a-4337-b71a-0af4ba825541","Type":"ContainerDied","Data":"dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4"} Oct 03 08:54:04 crc kubenswrapper[4790]: I1003 08:54:04.106343 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jxt5" event={"ID":"2a6d725a-e84a-4337-b71a-0af4ba825541","Type":"ContainerStarted","Data":"42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394"} Oct 03 08:54:04 crc kubenswrapper[4790]: I1003 08:54:04.127993 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2jxt5" podStartSLOduration=3.736123521 podStartE2EDuration="6.127970158s" podCreationTimestamp="2025-10-03 08:53:58 +0000 UTC" firstStartedPulling="2025-10-03 08:54:01.076941783 +0000 UTC m=+827.429876021" lastFinishedPulling="2025-10-03 08:54:03.46878842 +0000 UTC m=+829.821722658" observedRunningTime="2025-10-03 08:54:04.125146881 +0000 UTC m=+830.478081149" watchObservedRunningTime="2025-10-03 08:54:04.127970158 +0000 UTC m=+830.480904416" Oct 03 08:54:04 crc kubenswrapper[4790]: I1003 08:54:04.851135 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nmjt"] Oct 03 08:54:04 crc kubenswrapper[4790]: I1003 08:54:04.851389 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4nmjt" podUID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerName="registry-server" 
containerID="cri-o://25a1dc024169719310a65dcb89f0776c18201e3dba9bac7bf5d4c5239d74780b" gracePeriod=2 Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.114071 4790 generic.go:334] "Generic (PLEG): container finished" podID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerID="25a1dc024169719310a65dcb89f0776c18201e3dba9bac7bf5d4c5239d74780b" exitCode=0 Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.114107 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nmjt" event={"ID":"623a7260-3d53-47b8-a1e0-7b9287f21295","Type":"ContainerDied","Data":"25a1dc024169719310a65dcb89f0776c18201e3dba9bac7bf5d4c5239d74780b"} Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.288185 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.372539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-catalog-content\") pod \"623a7260-3d53-47b8-a1e0-7b9287f21295\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.372602 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-utilities\") pod \"623a7260-3d53-47b8-a1e0-7b9287f21295\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.372675 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzcst\" (UniqueName: \"kubernetes.io/projected/623a7260-3d53-47b8-a1e0-7b9287f21295-kube-api-access-fzcst\") pod \"623a7260-3d53-47b8-a1e0-7b9287f21295\" (UID: \"623a7260-3d53-47b8-a1e0-7b9287f21295\") " Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 
08:54:05.373419 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-utilities" (OuterVolumeSpecName: "utilities") pod "623a7260-3d53-47b8-a1e0-7b9287f21295" (UID: "623a7260-3d53-47b8-a1e0-7b9287f21295"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.378550 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/623a7260-3d53-47b8-a1e0-7b9287f21295-kube-api-access-fzcst" (OuterVolumeSpecName: "kube-api-access-fzcst") pod "623a7260-3d53-47b8-a1e0-7b9287f21295" (UID: "623a7260-3d53-47b8-a1e0-7b9287f21295"). InnerVolumeSpecName "kube-api-access-fzcst". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.384265 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "623a7260-3d53-47b8-a1e0-7b9287f21295" (UID: "623a7260-3d53-47b8-a1e0-7b9287f21295"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.473790 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzcst\" (UniqueName: \"kubernetes.io/projected/623a7260-3d53-47b8-a1e0-7b9287f21295-kube-api-access-fzcst\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.473829 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:05 crc kubenswrapper[4790]: I1003 08:54:05.473840 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/623a7260-3d53-47b8-a1e0-7b9287f21295-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:06 crc kubenswrapper[4790]: I1003 08:54:06.122248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nmjt" event={"ID":"623a7260-3d53-47b8-a1e0-7b9287f21295","Type":"ContainerDied","Data":"11301549debef24165bb1c02c2769b59377d44915cf525c4e6841eb0d8f57285"} Oct 03 08:54:06 crc kubenswrapper[4790]: I1003 08:54:06.122304 4790 scope.go:117] "RemoveContainer" containerID="25a1dc024169719310a65dcb89f0776c18201e3dba9bac7bf5d4c5239d74780b" Oct 03 08:54:06 crc kubenswrapper[4790]: I1003 08:54:06.122306 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nmjt" Oct 03 08:54:06 crc kubenswrapper[4790]: I1003 08:54:06.138817 4790 scope.go:117] "RemoveContainer" containerID="c87e55b19232b4f88fd90edcac00f4fe046f5aa994c9552bf903893ce29f7593" Oct 03 08:54:06 crc kubenswrapper[4790]: I1003 08:54:06.149720 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nmjt"] Oct 03 08:54:06 crc kubenswrapper[4790]: I1003 08:54:06.159201 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nmjt"] Oct 03 08:54:06 crc kubenswrapper[4790]: I1003 08:54:06.164019 4790 scope.go:117] "RemoveContainer" containerID="798b732bca5a7600f2939d51a8670683db106e3dc4cf224bc5eab0e0e77f290e" Oct 03 08:54:06 crc kubenswrapper[4790]: I1003 08:54:06.335429 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="623a7260-3d53-47b8-a1e0-7b9287f21295" path="/var/lib/kubelet/pods/623a7260-3d53-47b8-a1e0-7b9287f21295/volumes" Oct 03 08:54:08 crc kubenswrapper[4790]: I1003 08:54:08.607468 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:54:08 crc kubenswrapper[4790]: I1003 08:54:08.607869 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:54:08 crc kubenswrapper[4790]: I1003 08:54:08.653228 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:54:09 crc kubenswrapper[4790]: I1003 08:54:09.186705 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:54:11 crc kubenswrapper[4790]: I1003 08:54:11.250633 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2jxt5"] Oct 03 08:54:11 crc 
kubenswrapper[4790]: I1003 08:54:11.251247 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2jxt5" podUID="2a6d725a-e84a-4337-b71a-0af4ba825541" containerName="registry-server" containerID="cri-o://42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394" gracePeriod=2 Oct 03 08:54:11 crc kubenswrapper[4790]: I1003 08:54:11.775464 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:54:11 crc kubenswrapper[4790]: I1003 08:54:11.968640 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24dhs\" (UniqueName: \"kubernetes.io/projected/2a6d725a-e84a-4337-b71a-0af4ba825541-kube-api-access-24dhs\") pod \"2a6d725a-e84a-4337-b71a-0af4ba825541\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " Oct 03 08:54:11 crc kubenswrapper[4790]: I1003 08:54:11.968698 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-utilities\") pod \"2a6d725a-e84a-4337-b71a-0af4ba825541\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " Oct 03 08:54:11 crc kubenswrapper[4790]: I1003 08:54:11.968789 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-catalog-content\") pod \"2a6d725a-e84a-4337-b71a-0af4ba825541\" (UID: \"2a6d725a-e84a-4337-b71a-0af4ba825541\") " Oct 03 08:54:11 crc kubenswrapper[4790]: I1003 08:54:11.970279 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-utilities" (OuterVolumeSpecName: "utilities") pod "2a6d725a-e84a-4337-b71a-0af4ba825541" (UID: "2a6d725a-e84a-4337-b71a-0af4ba825541"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:11 crc kubenswrapper[4790]: I1003 08:54:11.977881 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a6d725a-e84a-4337-b71a-0af4ba825541-kube-api-access-24dhs" (OuterVolumeSpecName: "kube-api-access-24dhs") pod "2a6d725a-e84a-4337-b71a-0af4ba825541" (UID: "2a6d725a-e84a-4337-b71a-0af4ba825541"). InnerVolumeSpecName "kube-api-access-24dhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.018676 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a6d725a-e84a-4337-b71a-0af4ba825541" (UID: "2a6d725a-e84a-4337-b71a-0af4ba825541"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.070541 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24dhs\" (UniqueName: \"kubernetes.io/projected/2a6d725a-e84a-4337-b71a-0af4ba825541-kube-api-access-24dhs\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.070613 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.070627 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6d725a-e84a-4337-b71a-0af4ba825541-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.162635 4790 generic.go:334] "Generic (PLEG): container finished" podID="2a6d725a-e84a-4337-b71a-0af4ba825541" 
containerID="42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394" exitCode=0 Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.162690 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jxt5" event={"ID":"2a6d725a-e84a-4337-b71a-0af4ba825541","Type":"ContainerDied","Data":"42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394"} Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.162714 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2jxt5" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.162746 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2jxt5" event={"ID":"2a6d725a-e84a-4337-b71a-0af4ba825541","Type":"ContainerDied","Data":"befe9412a03a3d40ad2a658e0f3c6f9027231461300a8867081ae236d730d16b"} Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.162766 4790 scope.go:117] "RemoveContainer" containerID="42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.186621 4790 scope.go:117] "RemoveContainer" containerID="dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.205043 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2jxt5"] Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.205258 4790 scope.go:117] "RemoveContainer" containerID="4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.207501 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2jxt5"] Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.234407 4790 scope.go:117] "RemoveContainer" containerID="42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394" Oct 03 
08:54:12 crc kubenswrapper[4790]: E1003 08:54:12.234881 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394\": container with ID starting with 42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394 not found: ID does not exist" containerID="42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.234918 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394"} err="failed to get container status \"42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394\": rpc error: code = NotFound desc = could not find container \"42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394\": container with ID starting with 42e1cbcf49dec5789875eb96cfb2f4be0a82497dab749f01f95838e8b8aea394 not found: ID does not exist" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.234940 4790 scope.go:117] "RemoveContainer" containerID="dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4" Oct 03 08:54:12 crc kubenswrapper[4790]: E1003 08:54:12.235327 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4\": container with ID starting with dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4 not found: ID does not exist" containerID="dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.235380 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4"} err="failed to get container status 
\"dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4\": rpc error: code = NotFound desc = could not find container \"dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4\": container with ID starting with dad261a8c291e0170ae309cccbe1bcc949765761643a3a2eead60df09093cbd4 not found: ID does not exist" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.235408 4790 scope.go:117] "RemoveContainer" containerID="4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf" Oct 03 08:54:12 crc kubenswrapper[4790]: E1003 08:54:12.235786 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf\": container with ID starting with 4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf not found: ID does not exist" containerID="4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.235832 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf"} err="failed to get container status \"4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf\": rpc error: code = NotFound desc = could not find container \"4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf\": container with ID starting with 4b7d68c67b436ff2e00b49899caffad9a0318b6774000508e52faf19620af8bf not found: ID does not exist" Oct 03 08:54:12 crc kubenswrapper[4790]: I1003 08:54:12.336453 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a6d725a-e84a-4337-b71a-0af4ba825541" path="/var/lib/kubelet/pods/2a6d725a-e84a-4337-b71a-0af4ba825541/volumes" Oct 03 08:54:14 crc kubenswrapper[4790]: I1003 08:54:14.294331 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-webhook-server-687d9f4499-mr2f5" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.065060 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-c97c79c55-pqbls" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.835732 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-rzzll"] Oct 03 08:54:34 crc kubenswrapper[4790]: E1003 08:54:34.836076 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerName="extract-utilities" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836099 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerName="extract-utilities" Oct 03 08:54:34 crc kubenswrapper[4790]: E1003 08:54:34.836114 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerName="extract-content" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836122 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerName="extract-content" Oct 03 08:54:34 crc kubenswrapper[4790]: E1003 08:54:34.836137 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerName="registry-server" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836145 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerName="registry-server" Oct 03 08:54:34 crc kubenswrapper[4790]: E1003 08:54:34.836163 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bdf878-3f2c-49ae-a722-cedc2c075f67" containerName="registry-server" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836171 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bdf878-3f2c-49ae-a722-cedc2c075f67" 
containerName="registry-server" Oct 03 08:54:34 crc kubenswrapper[4790]: E1003 08:54:34.836185 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bdf878-3f2c-49ae-a722-cedc2c075f67" containerName="extract-content" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836192 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bdf878-3f2c-49ae-a722-cedc2c075f67" containerName="extract-content" Oct 03 08:54:34 crc kubenswrapper[4790]: E1003 08:54:34.836200 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6d725a-e84a-4337-b71a-0af4ba825541" containerName="registry-server" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836207 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6d725a-e84a-4337-b71a-0af4ba825541" containerName="registry-server" Oct 03 08:54:34 crc kubenswrapper[4790]: E1003 08:54:34.836214 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6d725a-e84a-4337-b71a-0af4ba825541" containerName="extract-utilities" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836220 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6d725a-e84a-4337-b71a-0af4ba825541" containerName="extract-utilities" Oct 03 08:54:34 crc kubenswrapper[4790]: E1003 08:54:34.836231 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76bdf878-3f2c-49ae-a722-cedc2c075f67" containerName="extract-utilities" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836237 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="76bdf878-3f2c-49ae-a722-cedc2c075f67" containerName="extract-utilities" Oct 03 08:54:34 crc kubenswrapper[4790]: E1003 08:54:34.836246 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6d725a-e84a-4337-b71a-0af4ba825541" containerName="extract-content" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836252 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6d725a-e84a-4337-b71a-0af4ba825541" 
containerName="extract-content" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836383 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="76bdf878-3f2c-49ae-a722-cedc2c075f67" containerName="registry-server" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836400 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="623a7260-3d53-47b8-a1e0-7b9287f21295" containerName="registry-server" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.836412 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a6d725a-e84a-4337-b71a-0af4ba825541" containerName="registry-server" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.838527 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.839724 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d"] Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.840434 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.840627 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-clh4c" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.840867 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.844199 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.844387 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.876751 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d"] Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.936262 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-lxwhp"] Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.937774 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lxwhp" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.940030 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.940218 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-n8mlm" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.940349 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.943512 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.960686 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-lbn9j"] Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.962129 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.966732 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.977493 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-metrics-certs\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.977541 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-frr-conf\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.977598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-metrics\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.977639 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-reloader\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.977699 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/966af9a3-d44e-4bf2-bff8-9b8573083daa-cert\") pod 
\"frr-k8s-webhook-server-64bf5d555-xtg2d\" (UID: \"966af9a3-d44e-4bf2-bff8-9b8573083daa\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.977736 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnmh\" (UniqueName: \"kubernetes.io/projected/966af9a3-d44e-4bf2-bff8-9b8573083daa-kube-api-access-plnmh\") pod \"frr-k8s-webhook-server-64bf5d555-xtg2d\" (UID: \"966af9a3-d44e-4bf2-bff8-9b8573083daa\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.977818 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-frr-startup\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.977909 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-frr-sockets\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.977931 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8tnw\" (UniqueName: \"kubernetes.io/projected/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-kube-api-access-t8tnw\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:34 crc kubenswrapper[4790]: I1003 08:54:34.978875 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-lbn9j"] Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.079690 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-metrics-certs\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.079778 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-frr-conf\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.079811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-metrics-certs\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.079836 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-metrics\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.079865 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-memberlist\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.079894 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-reloader\") pod 
\"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.079925 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9x49\" (UniqueName: \"kubernetes.io/projected/32b53a38-4a5c-44b1-9518-fe99746bf6a4-kube-api-access-r9x49\") pod \"controller-68d546b9d8-lbn9j\" (UID: \"32b53a38-4a5c-44b1-9518-fe99746bf6a4\") " pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.079955 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kwdd\" (UniqueName: \"kubernetes.io/projected/8f597a65-e358-498d-b635-6eafe7618337-kube-api-access-5kwdd\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.079984 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/966af9a3-d44e-4bf2-bff8-9b8573083daa-cert\") pod \"frr-k8s-webhook-server-64bf5d555-xtg2d\" (UID: \"966af9a3-d44e-4bf2-bff8-9b8573083daa\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.080035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnmh\" (UniqueName: \"kubernetes.io/projected/966af9a3-d44e-4bf2-bff8-9b8573083daa-kube-api-access-plnmh\") pod \"frr-k8s-webhook-server-64bf5d555-xtg2d\" (UID: \"966af9a3-d44e-4bf2-bff8-9b8573083daa\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.080066 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-frr-startup\") pod 
\"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.080099 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b53a38-4a5c-44b1-9518-fe99746bf6a4-metrics-certs\") pod \"controller-68d546b9d8-lbn9j\" (UID: \"32b53a38-4a5c-44b1-9518-fe99746bf6a4\") " pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.080141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-frr-sockets\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.080170 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8tnw\" (UniqueName: \"kubernetes.io/projected/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-kube-api-access-t8tnw\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.080207 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32b53a38-4a5c-44b1-9518-fe99746bf6a4-cert\") pod \"controller-68d546b9d8-lbn9j\" (UID: \"32b53a38-4a5c-44b1-9518-fe99746bf6a4\") " pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.080229 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8f597a65-e358-498d-b635-6eafe7618337-metallb-excludel2\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " 
pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.082200 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-reloader\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.082497 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-frr-sockets\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.082710 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-metrics\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.082836 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-frr-conf\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.082841 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-frr-startup\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.087617 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-metrics-certs\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.089339 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/966af9a3-d44e-4bf2-bff8-9b8573083daa-cert\") pod \"frr-k8s-webhook-server-64bf5d555-xtg2d\" (UID: \"966af9a3-d44e-4bf2-bff8-9b8573083daa\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.099742 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnmh\" (UniqueName: \"kubernetes.io/projected/966af9a3-d44e-4bf2-bff8-9b8573083daa-kube-api-access-plnmh\") pod \"frr-k8s-webhook-server-64bf5d555-xtg2d\" (UID: \"966af9a3-d44e-4bf2-bff8-9b8573083daa\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.105939 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8tnw\" (UniqueName: \"kubernetes.io/projected/904c6a5c-2937-4b23-bdbb-4dd0e461f9fd-kube-api-access-t8tnw\") pod \"frr-k8s-rzzll\" (UID: \"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd\") " pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.181252 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.181508 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-metrics-certs\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.181542 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-memberlist\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.181586 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9x49\" (UniqueName: \"kubernetes.io/projected/32b53a38-4a5c-44b1-9518-fe99746bf6a4-kube-api-access-r9x49\") pod \"controller-68d546b9d8-lbn9j\" (UID: \"32b53a38-4a5c-44b1-9518-fe99746bf6a4\") " pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.181604 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kwdd\" (UniqueName: \"kubernetes.io/projected/8f597a65-e358-498d-b635-6eafe7618337-kube-api-access-5kwdd\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.181649 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b53a38-4a5c-44b1-9518-fe99746bf6a4-metrics-certs\") pod \"controller-68d546b9d8-lbn9j\" (UID: \"32b53a38-4a5c-44b1-9518-fe99746bf6a4\") " pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 
08:54:35.181702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32b53a38-4a5c-44b1-9518-fe99746bf6a4-cert\") pod \"controller-68d546b9d8-lbn9j\" (UID: \"32b53a38-4a5c-44b1-9518-fe99746bf6a4\") " pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:35 crc kubenswrapper[4790]: E1003 08:54:35.181703 4790 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 08:54:35 crc kubenswrapper[4790]: E1003 08:54:35.181780 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-memberlist podName:8f597a65-e358-498d-b635-6eafe7618337 nodeName:}" failed. No retries permitted until 2025-10-03 08:54:35.681760594 +0000 UTC m=+862.034694922 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-memberlist") pod "speaker-lxwhp" (UID: "8f597a65-e358-498d-b635-6eafe7618337") : secret "metallb-memberlist" not found Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.181719 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8f597a65-e358-498d-b635-6eafe7618337-metallb-excludel2\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.182355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8f597a65-e358-498d-b635-6eafe7618337-metallb-excludel2\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.184424 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"metallb-system"/"metallb-webhook-cert" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.184882 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b53a38-4a5c-44b1-9518-fe99746bf6a4-metrics-certs\") pod \"controller-68d546b9d8-lbn9j\" (UID: \"32b53a38-4a5c-44b1-9518-fe99746bf6a4\") " pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.186921 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-metrics-certs\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.188873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.194877 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32b53a38-4a5c-44b1-9518-fe99746bf6a4-cert\") pod \"controller-68d546b9d8-lbn9j\" (UID: \"32b53a38-4a5c-44b1-9518-fe99746bf6a4\") " pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.199824 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9x49\" (UniqueName: \"kubernetes.io/projected/32b53a38-4a5c-44b1-9518-fe99746bf6a4-kube-api-access-r9x49\") pod \"controller-68d546b9d8-lbn9j\" (UID: \"32b53a38-4a5c-44b1-9518-fe99746bf6a4\") " pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.201232 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kwdd\" (UniqueName: \"kubernetes.io/projected/8f597a65-e358-498d-b635-6eafe7618337-kube-api-access-5kwdd\") 
pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.293959 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.523223 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-lbn9j"] Oct 03 08:54:35 crc kubenswrapper[4790]: W1003 08:54:35.529114 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b53a38_4a5c_44b1_9518_fe99746bf6a4.slice/crio-2093d43b70761b15e1e10711e8c3851177dcf60c9cd19ef036d2c12c6810f554 WatchSource:0}: Error finding container 2093d43b70761b15e1e10711e8c3851177dcf60c9cd19ef036d2c12c6810f554: Status 404 returned error can't find the container with id 2093d43b70761b15e1e10711e8c3851177dcf60c9cd19ef036d2c12c6810f554 Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.656882 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d"] Oct 03 08:54:35 crc kubenswrapper[4790]: W1003 08:54:35.668055 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod966af9a3_d44e_4bf2_bff8_9b8573083daa.slice/crio-06b26a1f49bc195694f20fbb8f804d66c8a7f46c5ddecccdc6d7b0646de4420a WatchSource:0}: Error finding container 06b26a1f49bc195694f20fbb8f804d66c8a7f46c5ddecccdc6d7b0646de4420a: Status 404 returned error can't find the container with id 06b26a1f49bc195694f20fbb8f804d66c8a7f46c5ddecccdc6d7b0646de4420a Oct 03 08:54:35 crc kubenswrapper[4790]: I1003 08:54:35.724993 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-memberlist\") pod \"speaker-lxwhp\" (UID: 
\"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:35 crc kubenswrapper[4790]: E1003 08:54:35.725199 4790 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 03 08:54:35 crc kubenswrapper[4790]: E1003 08:54:35.725244 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-memberlist podName:8f597a65-e358-498d-b635-6eafe7618337 nodeName:}" failed. No retries permitted until 2025-10-03 08:54:36.725230335 +0000 UTC m=+863.078164573 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-memberlist") pod "speaker-lxwhp" (UID: "8f597a65-e358-498d-b635-6eafe7618337") : secret "metallb-memberlist" not found Oct 03 08:54:36 crc kubenswrapper[4790]: I1003 08:54:36.349887 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" event={"ID":"966af9a3-d44e-4bf2-bff8-9b8573083daa","Type":"ContainerStarted","Data":"06b26a1f49bc195694f20fbb8f804d66c8a7f46c5ddecccdc6d7b0646de4420a"} Oct 03 08:54:36 crc kubenswrapper[4790]: I1003 08:54:36.350449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzzll" event={"ID":"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd","Type":"ContainerStarted","Data":"9f69d51cdcbd6a4576d6c201720e02861dd37eb9e10a54686aabf2a1a3e50e9d"} Oct 03 08:54:36 crc kubenswrapper[4790]: I1003 08:54:36.350475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-lbn9j" event={"ID":"32b53a38-4a5c-44b1-9518-fe99746bf6a4","Type":"ContainerStarted","Data":"2fb7f7789a30bb94a920317704f0a977214ef104a7683386f93a5a473ced4196"} Oct 03 08:54:36 crc kubenswrapper[4790]: I1003 08:54:36.350502 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:36 crc kubenswrapper[4790]: I1003 08:54:36.350518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-lbn9j" event={"ID":"32b53a38-4a5c-44b1-9518-fe99746bf6a4","Type":"ContainerStarted","Data":"ec454f2f386e2fdb039f97658189c4dda2ba32b206120c04795d12eabde74748"} Oct 03 08:54:36 crc kubenswrapper[4790]: I1003 08:54:36.350536 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-lbn9j" event={"ID":"32b53a38-4a5c-44b1-9518-fe99746bf6a4","Type":"ContainerStarted","Data":"2093d43b70761b15e1e10711e8c3851177dcf60c9cd19ef036d2c12c6810f554"} Oct 03 08:54:36 crc kubenswrapper[4790]: I1003 08:54:36.356217 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-lbn9j" podStartSLOduration=2.35618845 podStartE2EDuration="2.35618845s" podCreationTimestamp="2025-10-03 08:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:54:36.35252796 +0000 UTC m=+862.705462218" watchObservedRunningTime="2025-10-03 08:54:36.35618845 +0000 UTC m=+862.709122708" Oct 03 08:54:36 crc kubenswrapper[4790]: I1003 08:54:36.743051 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-memberlist\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:36 crc kubenswrapper[4790]: I1003 08:54:36.769202 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8f597a65-e358-498d-b635-6eafe7618337-memberlist\") pod \"speaker-lxwhp\" (UID: \"8f597a65-e358-498d-b635-6eafe7618337\") " pod="metallb-system/speaker-lxwhp" Oct 03 08:54:37 crc kubenswrapper[4790]: I1003 
08:54:37.057697 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-lxwhp" Oct 03 08:54:37 crc kubenswrapper[4790]: I1003 08:54:37.348325 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lxwhp" event={"ID":"8f597a65-e358-498d-b635-6eafe7618337","Type":"ContainerStarted","Data":"2c915d163b2acdebbb78ffc7f5e719d5c61f88cde2ed200403920e27c4a209d1"} Oct 03 08:54:38 crc kubenswrapper[4790]: I1003 08:54:38.364668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lxwhp" event={"ID":"8f597a65-e358-498d-b635-6eafe7618337","Type":"ContainerStarted","Data":"2c3d0e76ba769fc9621c353ddaa7663cb7ccc7e7836cdd569868e26be31f5594"} Oct 03 08:54:38 crc kubenswrapper[4790]: I1003 08:54:38.365013 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-lxwhp" Oct 03 08:54:38 crc kubenswrapper[4790]: I1003 08:54:38.365031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lxwhp" event={"ID":"8f597a65-e358-498d-b635-6eafe7618337","Type":"ContainerStarted","Data":"e68b5237a4cb2380ee1e4a23f10990914d3597f57aab9c0b21905b08534cc079"} Oct 03 08:54:38 crc kubenswrapper[4790]: I1003 08:54:38.386090 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-lxwhp" podStartSLOduration=4.386069238 podStartE2EDuration="4.386069238s" podCreationTimestamp="2025-10-03 08:54:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:54:38.38394457 +0000 UTC m=+864.736878808" watchObservedRunningTime="2025-10-03 08:54:38.386069238 +0000 UTC m=+864.739003486" Oct 03 08:54:43 crc kubenswrapper[4790]: I1003 08:54:43.398265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" 
event={"ID":"966af9a3-d44e-4bf2-bff8-9b8573083daa","Type":"ContainerStarted","Data":"bd5061fbc107be4c8c8445b25ce70bb94d756029899e6e9ea78e95f3fc6a33df"} Oct 03 08:54:43 crc kubenswrapper[4790]: I1003 08:54:43.398861 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" Oct 03 08:54:43 crc kubenswrapper[4790]: I1003 08:54:43.401796 4790 generic.go:334] "Generic (PLEG): container finished" podID="904c6a5c-2937-4b23-bdbb-4dd0e461f9fd" containerID="da24c7e13c5e015c6ae8f232f5fc39c504b32ea889fd5b29e1a5e180889687c5" exitCode=0 Oct 03 08:54:43 crc kubenswrapper[4790]: I1003 08:54:43.401869 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzzll" event={"ID":"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd","Type":"ContainerDied","Data":"da24c7e13c5e015c6ae8f232f5fc39c504b32ea889fd5b29e1a5e180889687c5"} Oct 03 08:54:43 crc kubenswrapper[4790]: I1003 08:54:43.421721 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" podStartSLOduration=2.137460727 podStartE2EDuration="9.421698686s" podCreationTimestamp="2025-10-03 08:54:34 +0000 UTC" firstStartedPulling="2025-10-03 08:54:35.672433713 +0000 UTC m=+862.025367951" lastFinishedPulling="2025-10-03 08:54:42.956671672 +0000 UTC m=+869.309605910" observedRunningTime="2025-10-03 08:54:43.415146927 +0000 UTC m=+869.768081185" watchObservedRunningTime="2025-10-03 08:54:43.421698686 +0000 UTC m=+869.774632934" Oct 03 08:54:44 crc kubenswrapper[4790]: I1003 08:54:44.409876 4790 generic.go:334] "Generic (PLEG): container finished" podID="904c6a5c-2937-4b23-bdbb-4dd0e461f9fd" containerID="9bb2d2b631d8903e908234977cc7d21ae1d89af5e60e93b6d577084a0c0db962" exitCode=0 Oct 03 08:54:44 crc kubenswrapper[4790]: I1003 08:54:44.409982 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzzll" 
event={"ID":"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd","Type":"ContainerDied","Data":"9bb2d2b631d8903e908234977cc7d21ae1d89af5e60e93b6d577084a0c0db962"} Oct 03 08:54:45 crc kubenswrapper[4790]: I1003 08:54:45.298400 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-lbn9j" Oct 03 08:54:45 crc kubenswrapper[4790]: I1003 08:54:45.419628 4790 generic.go:334] "Generic (PLEG): container finished" podID="904c6a5c-2937-4b23-bdbb-4dd0e461f9fd" containerID="36c8a4915a188f7a3bf23ffd21a29c878f0df8461d6cec0adb1b4cc6d80f019e" exitCode=0 Oct 03 08:54:45 crc kubenswrapper[4790]: I1003 08:54:45.419674 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzzll" event={"ID":"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd","Type":"ContainerDied","Data":"36c8a4915a188f7a3bf23ffd21a29c878f0df8461d6cec0adb1b4cc6d80f019e"} Oct 03 08:54:46 crc kubenswrapper[4790]: I1003 08:54:46.434371 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzzll" event={"ID":"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd","Type":"ContainerStarted","Data":"23e7fd878077623f630838b2078a6424e7ababd31127ee1cdf3a392ce9e83237"} Oct 03 08:54:46 crc kubenswrapper[4790]: I1003 08:54:46.434740 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzzll" event={"ID":"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd","Type":"ContainerStarted","Data":"058669530d3b2492ef69a8c4710ef5442e8004b63ea6ce599e36a64549baa58d"} Oct 03 08:54:46 crc kubenswrapper[4790]: I1003 08:54:46.434754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzzll" event={"ID":"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd","Type":"ContainerStarted","Data":"1cb045f440734c38c78a46c2291698bbb121da63badc2eec18227346639e0096"} Oct 03 08:54:46 crc kubenswrapper[4790]: I1003 08:54:46.434773 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzzll" 
event={"ID":"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd","Type":"ContainerStarted","Data":"9d9d38a62d9f46031971747a17b035c84f382b73fd83b7e45f9bf98028d995d0"} Oct 03 08:54:46 crc kubenswrapper[4790]: I1003 08:54:46.434784 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzzll" event={"ID":"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd","Type":"ContainerStarted","Data":"945a69173d749241544198b050b911b92b3e3e85ecdbe7fbe11051916614e32e"} Oct 03 08:54:47 crc kubenswrapper[4790]: I1003 08:54:47.067218 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-lxwhp" Oct 03 08:54:47 crc kubenswrapper[4790]: I1003 08:54:47.451523 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-rzzll" event={"ID":"904c6a5c-2937-4b23-bdbb-4dd0e461f9fd","Type":"ContainerStarted","Data":"ebe7ed0f14bcbc55ca1e76bcda0ca1e7b7f2aca808c58734c012ef3b67e9abbe"} Oct 03 08:54:47 crc kubenswrapper[4790]: I1003 08:54:47.451802 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:47 crc kubenswrapper[4790]: I1003 08:54:47.479548 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-rzzll" podStartSLOduration=5.895998474 podStartE2EDuration="13.479530381s" podCreationTimestamp="2025-10-03 08:54:34 +0000 UTC" firstStartedPulling="2025-10-03 08:54:35.354111019 +0000 UTC m=+861.707045247" lastFinishedPulling="2025-10-03 08:54:42.937642916 +0000 UTC m=+869.290577154" observedRunningTime="2025-10-03 08:54:47.479279534 +0000 UTC m=+873.832213802" watchObservedRunningTime="2025-10-03 08:54:47.479530381 +0000 UTC m=+873.832464619" Oct 03 08:54:49 crc kubenswrapper[4790]: I1003 08:54:49.856520 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-sg48w"] Oct 03 08:54:49 crc kubenswrapper[4790]: I1003 08:54:49.857626 4790 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sg48w" Oct 03 08:54:49 crc kubenswrapper[4790]: I1003 08:54:49.860470 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 03 08:54:49 crc kubenswrapper[4790]: I1003 08:54:49.860681 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 03 08:54:49 crc kubenswrapper[4790]: I1003 08:54:49.869231 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-p6s4z" Oct 03 08:54:49 crc kubenswrapper[4790]: I1003 08:54:49.914078 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sg48w"] Oct 03 08:54:50 crc kubenswrapper[4790]: I1003 08:54:50.038407 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwqqb\" (UniqueName: \"kubernetes.io/projected/b44b03bb-79b2-4aec-9775-5ae18ab0d4d0-kube-api-access-cwqqb\") pod \"openstack-operator-index-sg48w\" (UID: \"b44b03bb-79b2-4aec-9775-5ae18ab0d4d0\") " pod="openstack-operators/openstack-operator-index-sg48w" Oct 03 08:54:50 crc kubenswrapper[4790]: I1003 08:54:50.140364 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwqqb\" (UniqueName: \"kubernetes.io/projected/b44b03bb-79b2-4aec-9775-5ae18ab0d4d0-kube-api-access-cwqqb\") pod \"openstack-operator-index-sg48w\" (UID: \"b44b03bb-79b2-4aec-9775-5ae18ab0d4d0\") " pod="openstack-operators/openstack-operator-index-sg48w" Oct 03 08:54:50 crc kubenswrapper[4790]: I1003 08:54:50.159009 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwqqb\" (UniqueName: \"kubernetes.io/projected/b44b03bb-79b2-4aec-9775-5ae18ab0d4d0-kube-api-access-cwqqb\") pod \"openstack-operator-index-sg48w\" (UID: 
\"b44b03bb-79b2-4aec-9775-5ae18ab0d4d0\") " pod="openstack-operators/openstack-operator-index-sg48w" Oct 03 08:54:50 crc kubenswrapper[4790]: I1003 08:54:50.177709 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sg48w" Oct 03 08:54:50 crc kubenswrapper[4790]: I1003 08:54:50.181807 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:50 crc kubenswrapper[4790]: I1003 08:54:50.227095 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:50 crc kubenswrapper[4790]: I1003 08:54:50.612788 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sg48w"] Oct 03 08:54:51 crc kubenswrapper[4790]: I1003 08:54:51.477599 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sg48w" event={"ID":"b44b03bb-79b2-4aec-9775-5ae18ab0d4d0","Type":"ContainerStarted","Data":"d58309efa315921b496d297311949d693d5067f1edb944e8e791edf4f6c79a84"} Oct 03 08:54:53 crc kubenswrapper[4790]: I1003 08:54:53.228256 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-sg48w"] Oct 03 08:54:53 crc kubenswrapper[4790]: I1003 08:54:53.494112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sg48w" event={"ID":"b44b03bb-79b2-4aec-9775-5ae18ab0d4d0","Type":"ContainerStarted","Data":"36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6"} Oct 03 08:54:53 crc kubenswrapper[4790]: I1003 08:54:53.533000 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-sg48w" podStartSLOduration=2.512926271 podStartE2EDuration="4.532970734s" podCreationTimestamp="2025-10-03 08:54:49 +0000 UTC" firstStartedPulling="2025-10-03 
08:54:50.62912549 +0000 UTC m=+876.982059728" lastFinishedPulling="2025-10-03 08:54:52.649169953 +0000 UTC m=+879.002104191" observedRunningTime="2025-10-03 08:54:53.513232419 +0000 UTC m=+879.866166697" watchObservedRunningTime="2025-10-03 08:54:53.532970734 +0000 UTC m=+879.885905002" Oct 03 08:54:53 crc kubenswrapper[4790]: I1003 08:54:53.842331 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-w2g5j"] Oct 03 08:54:53 crc kubenswrapper[4790]: I1003 08:54:53.843723 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w2g5j" Oct 03 08:54:53 crc kubenswrapper[4790]: I1003 08:54:53.856143 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w2g5j"] Oct 03 08:54:53 crc kubenswrapper[4790]: I1003 08:54:53.946369 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmxq5\" (UniqueName: \"kubernetes.io/projected/9a285450-ca0c-4d27-b51b-81474aaa7d8f-kube-api-access-tmxq5\") pod \"openstack-operator-index-w2g5j\" (UID: \"9a285450-ca0c-4d27-b51b-81474aaa7d8f\") " pod="openstack-operators/openstack-operator-index-w2g5j" Oct 03 08:54:54 crc kubenswrapper[4790]: I1003 08:54:54.047783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmxq5\" (UniqueName: \"kubernetes.io/projected/9a285450-ca0c-4d27-b51b-81474aaa7d8f-kube-api-access-tmxq5\") pod \"openstack-operator-index-w2g5j\" (UID: \"9a285450-ca0c-4d27-b51b-81474aaa7d8f\") " pod="openstack-operators/openstack-operator-index-w2g5j" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:54.072007 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmxq5\" (UniqueName: \"kubernetes.io/projected/9a285450-ca0c-4d27-b51b-81474aaa7d8f-kube-api-access-tmxq5\") pod \"openstack-operator-index-w2g5j\" (UID: 
\"9a285450-ca0c-4d27-b51b-81474aaa7d8f\") " pod="openstack-operators/openstack-operator-index-w2g5j" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:54.177379 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w2g5j" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:54.499632 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-sg48w" podUID="b44b03bb-79b2-4aec-9775-5ae18ab0d4d0" containerName="registry-server" containerID="cri-o://36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6" gracePeriod=2 Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.187347 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-rzzll" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.193348 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-xtg2d" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.349388 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sg48w" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.367353 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwqqb\" (UniqueName: \"kubernetes.io/projected/b44b03bb-79b2-4aec-9775-5ae18ab0d4d0-kube-api-access-cwqqb\") pod \"b44b03bb-79b2-4aec-9775-5ae18ab0d4d0\" (UID: \"b44b03bb-79b2-4aec-9775-5ae18ab0d4d0\") " Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.373702 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b44b03bb-79b2-4aec-9775-5ae18ab0d4d0-kube-api-access-cwqqb" (OuterVolumeSpecName: "kube-api-access-cwqqb") pod "b44b03bb-79b2-4aec-9775-5ae18ab0d4d0" (UID: "b44b03bb-79b2-4aec-9775-5ae18ab0d4d0"). 
InnerVolumeSpecName "kube-api-access-cwqqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.412440 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w2g5j"] Oct 03 08:54:55 crc kubenswrapper[4790]: W1003 08:54:55.418292 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a285450_ca0c_4d27_b51b_81474aaa7d8f.slice/crio-3e6fd06a2dec6b1f9db6d433078e007568a9f4a89a5e717e38cea293dd3a2194 WatchSource:0}: Error finding container 3e6fd06a2dec6b1f9db6d433078e007568a9f4a89a5e717e38cea293dd3a2194: Status 404 returned error can't find the container with id 3e6fd06a2dec6b1f9db6d433078e007568a9f4a89a5e717e38cea293dd3a2194 Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.469721 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwqqb\" (UniqueName: \"kubernetes.io/projected/b44b03bb-79b2-4aec-9775-5ae18ab0d4d0-kube-api-access-cwqqb\") on node \"crc\" DevicePath \"\"" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.506372 4790 generic.go:334] "Generic (PLEG): container finished" podID="b44b03bb-79b2-4aec-9775-5ae18ab0d4d0" containerID="36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6" exitCode=0 Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.506411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sg48w" event={"ID":"b44b03bb-79b2-4aec-9775-5ae18ab0d4d0","Type":"ContainerDied","Data":"36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6"} Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.506451 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sg48w" 
event={"ID":"b44b03bb-79b2-4aec-9775-5ae18ab0d4d0","Type":"ContainerDied","Data":"d58309efa315921b496d297311949d693d5067f1edb944e8e791edf4f6c79a84"} Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.506467 4790 scope.go:117] "RemoveContainer" containerID="36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.506424 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sg48w" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.509263 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w2g5j" event={"ID":"9a285450-ca0c-4d27-b51b-81474aaa7d8f","Type":"ContainerStarted","Data":"3e6fd06a2dec6b1f9db6d433078e007568a9f4a89a5e717e38cea293dd3a2194"} Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.525090 4790 scope.go:117] "RemoveContainer" containerID="36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6" Oct 03 08:54:55 crc kubenswrapper[4790]: E1003 08:54:55.525462 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6\": container with ID starting with 36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6 not found: ID does not exist" containerID="36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.525506 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6"} err="failed to get container status \"36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6\": rpc error: code = NotFound desc = could not find container \"36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6\": container with ID starting with 
36674ddc987b7c806f5aceccb335493588d07f234ace69b85cfca0f723839cc6 not found: ID does not exist" Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.554778 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-sg48w"] Oct 03 08:54:55 crc kubenswrapper[4790]: I1003 08:54:55.559283 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-sg48w"] Oct 03 08:54:56 crc kubenswrapper[4790]: I1003 08:54:56.344881 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b44b03bb-79b2-4aec-9775-5ae18ab0d4d0" path="/var/lib/kubelet/pods/b44b03bb-79b2-4aec-9775-5ae18ab0d4d0/volumes" Oct 03 08:54:56 crc kubenswrapper[4790]: I1003 08:54:56.520149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w2g5j" event={"ID":"9a285450-ca0c-4d27-b51b-81474aaa7d8f","Type":"ContainerStarted","Data":"9b736034d2f609ee946451712b11c60826097c03428b6cbab99a9a3818426970"} Oct 03 08:54:56 crc kubenswrapper[4790]: I1003 08:54:56.544614 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-w2g5j" podStartSLOduration=3.500946948 podStartE2EDuration="3.544560541s" podCreationTimestamp="2025-10-03 08:54:53 +0000 UTC" firstStartedPulling="2025-10-03 08:54:55.423233696 +0000 UTC m=+881.776167934" lastFinishedPulling="2025-10-03 08:54:55.466847299 +0000 UTC m=+881.819781527" observedRunningTime="2025-10-03 08:54:56.538188698 +0000 UTC m=+882.891122956" watchObservedRunningTime="2025-10-03 08:54:56.544560541 +0000 UTC m=+882.897494819" Oct 03 08:55:04 crc kubenswrapper[4790]: I1003 08:55:04.177823 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-w2g5j" Oct 03 08:55:04 crc kubenswrapper[4790]: I1003 08:55:04.178203 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-w2g5j" Oct 03 08:55:04 crc kubenswrapper[4790]: I1003 08:55:04.205740 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-w2g5j" Oct 03 08:55:04 crc kubenswrapper[4790]: I1003 08:55:04.626561 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-w2g5j" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.688265 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz"] Oct 03 08:55:05 crc kubenswrapper[4790]: E1003 08:55:05.701969 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b44b03bb-79b2-4aec-9775-5ae18ab0d4d0" containerName="registry-server" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.702018 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b44b03bb-79b2-4aec-9775-5ae18ab0d4d0" containerName="registry-server" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.702416 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b44b03bb-79b2-4aec-9775-5ae18ab0d4d0" containerName="registry-server" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.705126 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.708167 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nvl2g" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.710693 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz"] Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.714508 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-util\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.714550 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-bundle\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.714825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n84k\" (UniqueName: \"kubernetes.io/projected/a3112067-ef2a-471a-94af-601f199e667d-kube-api-access-6n84k\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 
08:55:05.815644 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-util\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.815693 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-bundle\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.815774 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n84k\" (UniqueName: \"kubernetes.io/projected/a3112067-ef2a-471a-94af-601f199e667d-kube-api-access-6n84k\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.816363 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-util\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.816633 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-bundle\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:05 crc kubenswrapper[4790]: I1003 08:55:05.849394 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n84k\" (UniqueName: \"kubernetes.io/projected/a3112067-ef2a-471a-94af-601f199e667d-kube-api-access-6n84k\") pod \"05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:06 crc kubenswrapper[4790]: I1003 08:55:06.028455 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:06 crc kubenswrapper[4790]: I1003 08:55:06.494790 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz"] Oct 03 08:55:06 crc kubenswrapper[4790]: W1003 08:55:06.503875 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3112067_ef2a_471a_94af_601f199e667d.slice/crio-8aac40ebd2287e204c1328d4ba6e5679c37b36e20c1cb8bd59f729e856bc9680 WatchSource:0}: Error finding container 8aac40ebd2287e204c1328d4ba6e5679c37b36e20c1cb8bd59f729e856bc9680: Status 404 returned error can't find the container with id 8aac40ebd2287e204c1328d4ba6e5679c37b36e20c1cb8bd59f729e856bc9680 Oct 03 08:55:06 crc kubenswrapper[4790]: I1003 08:55:06.594413 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" 
event={"ID":"a3112067-ef2a-471a-94af-601f199e667d","Type":"ContainerStarted","Data":"8aac40ebd2287e204c1328d4ba6e5679c37b36e20c1cb8bd59f729e856bc9680"} Oct 03 08:55:07 crc kubenswrapper[4790]: I1003 08:55:07.606348 4790 generic.go:334] "Generic (PLEG): container finished" podID="a3112067-ef2a-471a-94af-601f199e667d" containerID="7d0a7240489d09ab9fff1f421c1aab6028808a27de41c2704b74e677697ced2e" exitCode=0 Oct 03 08:55:07 crc kubenswrapper[4790]: I1003 08:55:07.606415 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" event={"ID":"a3112067-ef2a-471a-94af-601f199e667d","Type":"ContainerDied","Data":"7d0a7240489d09ab9fff1f421c1aab6028808a27de41c2704b74e677697ced2e"} Oct 03 08:55:08 crc kubenswrapper[4790]: I1003 08:55:08.614945 4790 generic.go:334] "Generic (PLEG): container finished" podID="a3112067-ef2a-471a-94af-601f199e667d" containerID="80122643f546cbf6a00d6b13d1828c6c1adbcdfd8d46d5ab5e6ac5ed4f6a993a" exitCode=0 Oct 03 08:55:08 crc kubenswrapper[4790]: I1003 08:55:08.615000 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" event={"ID":"a3112067-ef2a-471a-94af-601f199e667d","Type":"ContainerDied","Data":"80122643f546cbf6a00d6b13d1828c6c1adbcdfd8d46d5ab5e6ac5ed4f6a993a"} Oct 03 08:55:09 crc kubenswrapper[4790]: I1003 08:55:09.626392 4790 generic.go:334] "Generic (PLEG): container finished" podID="a3112067-ef2a-471a-94af-601f199e667d" containerID="11cbd877f79008438f9c1a9fcbe1576fb9290a3bdbaf5a18e15ec4d12aa2e6be" exitCode=0 Oct 03 08:55:09 crc kubenswrapper[4790]: I1003 08:55:09.626468 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" event={"ID":"a3112067-ef2a-471a-94af-601f199e667d","Type":"ContainerDied","Data":"11cbd877f79008438f9c1a9fcbe1576fb9290a3bdbaf5a18e15ec4d12aa2e6be"} Oct 03 
08:55:10 crc kubenswrapper[4790]: I1003 08:55:10.186600 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:55:10 crc kubenswrapper[4790]: I1003 08:55:10.186657 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:55:10 crc kubenswrapper[4790]: I1003 08:55:10.922440 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.006676 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-util\") pod \"a3112067-ef2a-471a-94af-601f199e667d\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.006782 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-bundle\") pod \"a3112067-ef2a-471a-94af-601f199e667d\" (UID: \"a3112067-ef2a-471a-94af-601f199e667d\") " Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.006886 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n84k\" (UniqueName: \"kubernetes.io/projected/a3112067-ef2a-471a-94af-601f199e667d-kube-api-access-6n84k\") pod \"a3112067-ef2a-471a-94af-601f199e667d\" (UID: 
\"a3112067-ef2a-471a-94af-601f199e667d\") " Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.008784 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-bundle" (OuterVolumeSpecName: "bundle") pod "a3112067-ef2a-471a-94af-601f199e667d" (UID: "a3112067-ef2a-471a-94af-601f199e667d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.012759 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3112067-ef2a-471a-94af-601f199e667d-kube-api-access-6n84k" (OuterVolumeSpecName: "kube-api-access-6n84k") pod "a3112067-ef2a-471a-94af-601f199e667d" (UID: "a3112067-ef2a-471a-94af-601f199e667d"). InnerVolumeSpecName "kube-api-access-6n84k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.021802 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-util" (OuterVolumeSpecName: "util") pod "a3112067-ef2a-471a-94af-601f199e667d" (UID: "a3112067-ef2a-471a-94af-601f199e667d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.108659 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n84k\" (UniqueName: \"kubernetes.io/projected/a3112067-ef2a-471a-94af-601f199e667d-kube-api-access-6n84k\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.108718 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-util\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.108736 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3112067-ef2a-471a-94af-601f199e667d-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.643279 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" event={"ID":"a3112067-ef2a-471a-94af-601f199e667d","Type":"ContainerDied","Data":"8aac40ebd2287e204c1328d4ba6e5679c37b36e20c1cb8bd59f729e856bc9680"} Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.643323 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz" Oct 03 08:55:11 crc kubenswrapper[4790]: I1003 08:55:11.643334 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aac40ebd2287e204c1328d4ba6e5679c37b36e20c1cb8bd59f729e856bc9680" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.113737 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm"] Oct 03 08:55:18 crc kubenswrapper[4790]: E1003 08:55:18.114651 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3112067-ef2a-471a-94af-601f199e667d" containerName="util" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.114668 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3112067-ef2a-471a-94af-601f199e667d" containerName="util" Oct 03 08:55:18 crc kubenswrapper[4790]: E1003 08:55:18.114683 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3112067-ef2a-471a-94af-601f199e667d" containerName="extract" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.114690 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3112067-ef2a-471a-94af-601f199e667d" containerName="extract" Oct 03 08:55:18 crc kubenswrapper[4790]: E1003 08:55:18.114699 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3112067-ef2a-471a-94af-601f199e667d" containerName="pull" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.114707 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3112067-ef2a-471a-94af-601f199e667d" containerName="pull" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.114830 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3112067-ef2a-471a-94af-601f199e667d" containerName="extract" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.115671 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.121765 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-hwrzf" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.167202 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm"] Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.212107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8grp9\" (UniqueName: \"kubernetes.io/projected/f1035867-3856-4436-8175-e21df1fbb18c-kube-api-access-8grp9\") pod \"openstack-operator-controller-operator-7f8d586cd6-ptxlm\" (UID: \"f1035867-3856-4436-8175-e21df1fbb18c\") " pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.313205 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8grp9\" (UniqueName: \"kubernetes.io/projected/f1035867-3856-4436-8175-e21df1fbb18c-kube-api-access-8grp9\") pod \"openstack-operator-controller-operator-7f8d586cd6-ptxlm\" (UID: \"f1035867-3856-4436-8175-e21df1fbb18c\") " pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.335070 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8grp9\" (UniqueName: \"kubernetes.io/projected/f1035867-3856-4436-8175-e21df1fbb18c-kube-api-access-8grp9\") pod \"openstack-operator-controller-operator-7f8d586cd6-ptxlm\" (UID: \"f1035867-3856-4436-8175-e21df1fbb18c\") " pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.441770 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" Oct 03 08:55:18 crc kubenswrapper[4790]: I1003 08:55:18.861743 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm"] Oct 03 08:55:19 crc kubenswrapper[4790]: I1003 08:55:19.705880 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" event={"ID":"f1035867-3856-4436-8175-e21df1fbb18c","Type":"ContainerStarted","Data":"b0ab181aace41e074dc5c7e97db7290ec32e8aeacd7e1bac87c8ef8596d6be4a"} Oct 03 08:55:22 crc kubenswrapper[4790]: I1003 08:55:22.758014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" event={"ID":"f1035867-3856-4436-8175-e21df1fbb18c","Type":"ContainerStarted","Data":"e75ded84bab601c088790b9c5e3ea9f50cf92ba492db188ece53f6f73b250056"} Oct 03 08:55:24 crc kubenswrapper[4790]: I1003 08:55:24.771693 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" event={"ID":"f1035867-3856-4436-8175-e21df1fbb18c","Type":"ContainerStarted","Data":"47830dda1c9fb1a76f643dbe5cf14c5702fe31fe677a5c9f3d0d8c7a171892f3"} Oct 03 08:55:24 crc kubenswrapper[4790]: I1003 08:55:24.772048 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" Oct 03 08:55:24 crc kubenswrapper[4790]: I1003 08:55:24.801794 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" podStartSLOduration=1.14650064 podStartE2EDuration="6.801770305s" podCreationTimestamp="2025-10-03 08:55:18 +0000 UTC" firstStartedPulling="2025-10-03 
08:55:18.871020808 +0000 UTC m=+905.223955046" lastFinishedPulling="2025-10-03 08:55:24.526290473 +0000 UTC m=+910.879224711" observedRunningTime="2025-10-03 08:55:24.798527757 +0000 UTC m=+911.151461995" watchObservedRunningTime="2025-10-03 08:55:24.801770305 +0000 UTC m=+911.154704553" Oct 03 08:55:28 crc kubenswrapper[4790]: I1003 08:55:28.445232 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7f8d586cd6-ptxlm" Oct 03 08:55:40 crc kubenswrapper[4790]: I1003 08:55:40.188243 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:55:40 crc kubenswrapper[4790]: I1003 08:55:40.188684 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.642994 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.644705 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.649679 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.652172 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.653831 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-bvjbs" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.655275 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.655556 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-gmlxc" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.670913 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.672548 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.676816 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.678230 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-ln67t" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.701527 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.710085 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nvvz\" (UniqueName: \"kubernetes.io/projected/9b9ab2ca-8c9e-46d1-af78-b811e32b05f0-kube-api-access-7nvvz\") pod \"cinder-operator-controller-manager-79d68d6c85-q7g8m\" (UID: \"9b9ab2ca-8c9e-46d1-af78-b811e32b05f0\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.710347 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz6tz\" (UniqueName: \"kubernetes.io/projected/f598237f-5311-44e2-8ed2-3de81bbebc91-kube-api-access-pz6tz\") pod \"barbican-operator-controller-manager-6c675fb79f-rtftw\" (UID: \"f598237f-5311-44e2-8ed2-3de81bbebc91\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.726445 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.727878 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.731128 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.733868 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-chwh8" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.734768 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-h2km6"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.735694 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.741946 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-c6qm6" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.751134 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-h2km6"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.791571 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.792822 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.800455 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-r6xxk" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.811254 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsmmw\" (UniqueName: \"kubernetes.io/projected/d6653934-c835-4642-a1da-9159b7f85dcb-kube-api-access-nsmmw\") pod \"heat-operator-controller-manager-599898f689-h2km6\" (UID: \"d6653934-c835-4642-a1da-9159b7f85dcb\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.811320 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz6tz\" (UniqueName: \"kubernetes.io/projected/f598237f-5311-44e2-8ed2-3de81bbebc91-kube-api-access-pz6tz\") pod \"barbican-operator-controller-manager-6c675fb79f-rtftw\" (UID: \"f598237f-5311-44e2-8ed2-3de81bbebc91\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.811401 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fsgz\" (UniqueName: \"kubernetes.io/projected/a168c4e9-522f-4f9b-97d3-5c50008ec59c-kube-api-access-8fsgz\") pod \"glance-operator-controller-manager-846dff85b5-pqh6z\" (UID: \"a168c4e9-522f-4f9b-97d3-5c50008ec59c\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.811430 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nvvz\" (UniqueName: \"kubernetes.io/projected/9b9ab2ca-8c9e-46d1-af78-b811e32b05f0-kube-api-access-7nvvz\") pod 
\"cinder-operator-controller-manager-79d68d6c85-q7g8m\" (UID: \"9b9ab2ca-8c9e-46d1-af78-b811e32b05f0\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.811454 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wljt7\" (UniqueName: \"kubernetes.io/projected/b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf-kube-api-access-wljt7\") pod \"designate-operator-controller-manager-75dfd9b554-s25k5\" (UID: \"b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.820410 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.832838 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.834729 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.842010 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.842247 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-fpfqt" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.843638 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.846632 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fd6wp" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.896030 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nvvz\" (UniqueName: \"kubernetes.io/projected/9b9ab2ca-8c9e-46d1-af78-b811e32b05f0-kube-api-access-7nvvz\") pod \"cinder-operator-controller-manager-79d68d6c85-q7g8m\" (UID: \"9b9ab2ca-8c9e-46d1-af78-b811e32b05f0\") " pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.896801 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz6tz\" (UniqueName: \"kubernetes.io/projected/f598237f-5311-44e2-8ed2-3de81bbebc91-kube-api-access-pz6tz\") pod \"barbican-operator-controller-manager-6c675fb79f-rtftw\" (UID: \"f598237f-5311-44e2-8ed2-3de81bbebc91\") " pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.912510 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.914033 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fsgz\" (UniqueName: \"kubernetes.io/projected/a168c4e9-522f-4f9b-97d3-5c50008ec59c-kube-api-access-8fsgz\") pod \"glance-operator-controller-manager-846dff85b5-pqh6z\" (UID: \"a168c4e9-522f-4f9b-97d3-5c50008ec59c\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.914080 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wljt7\" (UniqueName: \"kubernetes.io/projected/b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf-kube-api-access-wljt7\") pod \"designate-operator-controller-manager-75dfd9b554-s25k5\" (UID: \"b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.914115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj2sr\" (UniqueName: \"kubernetes.io/projected/1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4-kube-api-access-hj2sr\") pod \"ironic-operator-controller-manager-84bc9db6cc-2vq5g\" (UID: \"1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.914177 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsmmw\" (UniqueName: \"kubernetes.io/projected/d6653934-c835-4642-a1da-9159b7f85dcb-kube-api-access-nsmmw\") pod \"heat-operator-controller-manager-599898f689-h2km6\" (UID: \"d6653934-c835-4642-a1da-9159b7f85dcb\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.914243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55hls\" (UniqueName: \"kubernetes.io/projected/5b3d2a0d-d36d-4a67-9c5f-6a85735127a1-kube-api-access-55hls\") pod \"horizon-operator-controller-manager-6769b867d9-d6t4r\" (UID: \"5b3d2a0d-d36d-4a67-9c5f-6a85735127a1\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.914264 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d48p\" (UniqueName: 
\"kubernetes.io/projected/12237b3a-fae2-4cac-b0d7-ee396be3da08-kube-api-access-9d48p\") pod \"keystone-operator-controller-manager-7f55849f88-twvmq\" (UID: \"12237b3a-fae2-4cac-b0d7-ee396be3da08\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.930825 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.955766 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.976789 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wljt7\" (UniqueName: \"kubernetes.io/projected/b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf-kube-api-access-wljt7\") pod \"designate-operator-controller-manager-75dfd9b554-s25k5\" (UID: \"b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.976938 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsmmw\" (UniqueName: \"kubernetes.io/projected/d6653934-c835-4642-a1da-9159b7f85dcb-kube-api-access-nsmmw\") pod \"heat-operator-controller-manager-599898f689-h2km6\" (UID: \"d6653934-c835-4642-a1da-9159b7f85dcb\") " pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.977274 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-v5mbh" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.977794 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.979020 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g"] Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.982838 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fsgz\" (UniqueName: \"kubernetes.io/projected/a168c4e9-522f-4f9b-97d3-5c50008ec59c-kube-api-access-8fsgz\") pod \"glance-operator-controller-manager-846dff85b5-pqh6z\" (UID: \"a168c4e9-522f-4f9b-97d3-5c50008ec59c\") " pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" Oct 03 08:55:44 crc kubenswrapper[4790]: I1003 08:55:44.993718 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.005163 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.011338 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.011681 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.018636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55hls\" (UniqueName: \"kubernetes.io/projected/5b3d2a0d-d36d-4a67-9c5f-6a85735127a1-kube-api-access-55hls\") pod \"horizon-operator-controller-manager-6769b867d9-d6t4r\" (UID: \"5b3d2a0d-d36d-4a67-9c5f-6a85735127a1\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.018688 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d48p\" (UniqueName: \"kubernetes.io/projected/12237b3a-fae2-4cac-b0d7-ee396be3da08-kube-api-access-9d48p\") pod \"keystone-operator-controller-manager-7f55849f88-twvmq\" (UID: \"12237b3a-fae2-4cac-b0d7-ee396be3da08\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.018748 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj2sr\" (UniqueName: \"kubernetes.io/projected/1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4-kube-api-access-hj2sr\") pod \"ironic-operator-controller-manager-84bc9db6cc-2vq5g\" (UID: \"1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.064015 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d48p\" (UniqueName: \"kubernetes.io/projected/12237b3a-fae2-4cac-b0d7-ee396be3da08-kube-api-access-9d48p\") pod 
\"keystone-operator-controller-manager-7f55849f88-twvmq\" (UID: \"12237b3a-fae2-4cac-b0d7-ee396be3da08\") " pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.064270 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj2sr\" (UniqueName: \"kubernetes.io/projected/1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4-kube-api-access-hj2sr\") pod \"ironic-operator-controller-manager-84bc9db6cc-2vq5g\" (UID: \"1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4\") " pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.073923 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.078389 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55hls\" (UniqueName: \"kubernetes.io/projected/5b3d2a0d-d36d-4a67-9c5f-6a85735127a1-kube-api-access-55hls\") pod \"horizon-operator-controller-manager-6769b867d9-d6t4r\" (UID: \"5b3d2a0d-d36d-4a67-9c5f-6a85735127a1\") " pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.085372 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.096989 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.098101 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.102922 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.103330 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-mt9fj" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.104319 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.110497 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.121012 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xmntr" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.125246 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn2js\" (UniqueName: \"kubernetes.io/projected/2bc6d6a2-91a5-464c-bba7-3a00ee0b1938-kube-api-access-zn2js\") pod \"infra-operator-controller-manager-5fbf469cd7-894gw\" (UID: \"2bc6d6a2-91a5-464c-bba7-3a00ee0b1938\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.125427 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bc6d6a2-91a5-464c-bba7-3a00ee0b1938-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-894gw\" (UID: \"2bc6d6a2-91a5-464c-bba7-3a00ee0b1938\") " 
pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.125869 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.139357 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.148985 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.150351 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.153129 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qkphd" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.161838 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.163060 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.165194 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bctl7" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.170457 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.174080 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.176158 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.181976 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.186794 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-mltk8" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.188605 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.200011 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.201060 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.201793 4790 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.202541 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.232441 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.232979 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.234281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhhgr\" (UniqueName: \"kubernetes.io/projected/9cfa9536-48e0-4639-a0ed-acab9ab92ed0-kube-api-access-rhhgr\") pod \"manila-operator-controller-manager-6fd6854b49-k8l6q\" (UID: \"9cfa9536-48e0-4639-a0ed-acab9ab92ed0\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.234354 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m945\" (UniqueName: \"kubernetes.io/projected/c97f26b2-5313-49db-8ee8-bcde2e793bfa-kube-api-access-8m945\") pod \"mariadb-operator-controller-manager-5c468bf4d4-sdqf6\" (UID: \"c97f26b2-5313-49db-8ee8-bcde2e793bfa\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.234390 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn2js\" (UniqueName: 
\"kubernetes.io/projected/2bc6d6a2-91a5-464c-bba7-3a00ee0b1938-kube-api-access-zn2js\") pod \"infra-operator-controller-manager-5fbf469cd7-894gw\" (UID: \"2bc6d6a2-91a5-464c-bba7-3a00ee0b1938\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.234438 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bc6d6a2-91a5-464c-bba7-3a00ee0b1938-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-894gw\" (UID: \"2bc6d6a2-91a5-464c-bba7-3a00ee0b1938\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:55:45 crc kubenswrapper[4790]: E1003 08:55:45.234579 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 03 08:55:45 crc kubenswrapper[4790]: E1003 08:55:45.234644 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bc6d6a2-91a5-464c-bba7-3a00ee0b1938-cert podName:2bc6d6a2-91a5-464c-bba7-3a00ee0b1938 nodeName:}" failed. No retries permitted until 2025-10-03 08:55:45.734626687 +0000 UTC m=+932.087560925 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2bc6d6a2-91a5-464c-bba7-3a00ee0b1938-cert") pod "infra-operator-controller-manager-5fbf469cd7-894gw" (UID: "2bc6d6a2-91a5-464c-bba7-3a00ee0b1938") : secret "infra-operator-webhook-server-cert" not found Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.236361 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-hwpqv" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.236545 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bs7mq" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.236692 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.238870 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.239542 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-md95f" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.255535 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.262702 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.265885 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.285173 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.290659 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.291928 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.301483 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.301602 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-xhb4t" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.307370 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.312294 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-gbpjb" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.312620 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.323251 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.328653 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn2js\" (UniqueName: \"kubernetes.io/projected/2bc6d6a2-91a5-464c-bba7-3a00ee0b1938-kube-api-access-zn2js\") pod \"infra-operator-controller-manager-5fbf469cd7-894gw\" (UID: \"2bc6d6a2-91a5-464c-bba7-3a00ee0b1938\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.337613 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgh6w\" (UniqueName: \"kubernetes.io/projected/c7787291-34a2-4682-88f4-7b05f28f36f2-kube-api-access-mgh6w\") pod \"placement-operator-controller-manager-7d8bb7f44c-8gls6\" (UID: \"c7787291-34a2-4682-88f4-7b05f28f36f2\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.338988 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj75x\" (UniqueName: \"kubernetes.io/projected/e6f6790b-7aca-49c0-881f-8a7d15ac4071-kube-api-access-gj75x\") pod \"neutron-operator-controller-manager-6574bf987d-668k4\" (UID: 
\"e6f6790b-7aca-49c0-881f-8a7d15ac4071\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.339058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhhgr\" (UniqueName: \"kubernetes.io/projected/9cfa9536-48e0-4639-a0ed-acab9ab92ed0-kube-api-access-rhhgr\") pod \"manila-operator-controller-manager-6fd6854b49-k8l6q\" (UID: \"9cfa9536-48e0-4639-a0ed-acab9ab92ed0\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.339129 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxvb\" (UniqueName: \"kubernetes.io/projected/759c9730-8d00-4516-9ed2-79f56be527bf-kube-api-access-8kxvb\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5\" (UID: \"759c9730-8d00-4516-9ed2-79f56be527bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.339162 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdkwk\" (UniqueName: \"kubernetes.io/projected/98c1e024-12c4-47f7-ac46-ae9f6bd45ab9-kube-api-access-zdkwk\") pod \"octavia-operator-controller-manager-59d6cfdf45-62zvh\" (UID: \"98c1e024-12c4-47f7-ac46-ae9f6bd45ab9\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.339193 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m945\" (UniqueName: \"kubernetes.io/projected/c97f26b2-5313-49db-8ee8-bcde2e793bfa-kube-api-access-8m945\") pod \"mariadb-operator-controller-manager-5c468bf4d4-sdqf6\" (UID: \"c97f26b2-5313-49db-8ee8-bcde2e793bfa\") " 
pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.339223 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skpq9\" (UniqueName: \"kubernetes.io/projected/94dc0b91-323f-4390-8af1-71019409bdf2-kube-api-access-skpq9\") pod \"ovn-operator-controller-manager-688db7b6c7-kv6xz\" (UID: \"94dc0b91-323f-4390-8af1-71019409bdf2\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.339279 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/759c9730-8d00-4516-9ed2-79f56be527bf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5\" (UID: \"759c9730-8d00-4516-9ed2-79f56be527bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.339307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh9g7\" (UniqueName: \"kubernetes.io/projected/b52c73ba-c737-4eaf-a913-155d05c96f6e-kube-api-access-lh9g7\") pod \"nova-operator-controller-manager-555c7456bd-tlmwh\" (UID: \"b52c73ba-c737-4eaf-a913-155d05c96f6e\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.344300 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.345605 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.346940 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.353217 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-slzfj" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.382073 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.387211 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhhgr\" (UniqueName: \"kubernetes.io/projected/9cfa9536-48e0-4639-a0ed-acab9ab92ed0-kube-api-access-rhhgr\") pod \"manila-operator-controller-manager-6fd6854b49-k8l6q\" (UID: \"9cfa9536-48e0-4639-a0ed-acab9ab92ed0\") " pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.411749 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.411873 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.415695 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m945\" (UniqueName: \"kubernetes.io/projected/c97f26b2-5313-49db-8ee8-bcde2e793bfa-kube-api-access-8m945\") pod \"mariadb-operator-controller-manager-5c468bf4d4-sdqf6\" (UID: \"c97f26b2-5313-49db-8ee8-bcde2e793bfa\") " pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.418528 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-mqzn7" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.428988 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.444390 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.445298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj75x\" (UniqueName: \"kubernetes.io/projected/e6f6790b-7aca-49c0-881f-8a7d15ac4071-kube-api-access-gj75x\") pod \"neutron-operator-controller-manager-6574bf987d-668k4\" (UID: \"e6f6790b-7aca-49c0-881f-8a7d15ac4071\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.445399 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9564\" (UniqueName: \"kubernetes.io/projected/15dfe278-fcef-4c6f-8c06-cce51c7d5f7c-kube-api-access-n9564\") pod \"test-operator-controller-manager-5cd5cb47d7-mshnf\" (UID: \"15dfe278-fcef-4c6f-8c06-cce51c7d5f7c\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.445436 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxvb\" (UniqueName: \"kubernetes.io/projected/759c9730-8d00-4516-9ed2-79f56be527bf-kube-api-access-8kxvb\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5\" (UID: \"759c9730-8d00-4516-9ed2-79f56be527bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.445467 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdkwk\" (UniqueName: \"kubernetes.io/projected/98c1e024-12c4-47f7-ac46-ae9f6bd45ab9-kube-api-access-zdkwk\") pod \"octavia-operator-controller-manager-59d6cfdf45-62zvh\" (UID: \"98c1e024-12c4-47f7-ac46-ae9f6bd45ab9\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" Oct 03 08:55:45 
crc kubenswrapper[4790]: I1003 08:55:45.445506 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skpq9\" (UniqueName: \"kubernetes.io/projected/94dc0b91-323f-4390-8af1-71019409bdf2-kube-api-access-skpq9\") pod \"ovn-operator-controller-manager-688db7b6c7-kv6xz\" (UID: \"94dc0b91-323f-4390-8af1-71019409bdf2\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.445575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/759c9730-8d00-4516-9ed2-79f56be527bf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5\" (UID: \"759c9730-8d00-4516-9ed2-79f56be527bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.445624 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh9g7\" (UniqueName: \"kubernetes.io/projected/b52c73ba-c737-4eaf-a913-155d05c96f6e-kube-api-access-lh9g7\") pod \"nova-operator-controller-manager-555c7456bd-tlmwh\" (UID: \"b52c73ba-c737-4eaf-a913-155d05c96f6e\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.445654 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78t6l\" (UniqueName: \"kubernetes.io/projected/7c58f90d-6708-4bb2-ba68-fb4e08b99c61-kube-api-access-78t6l\") pod \"swift-operator-controller-manager-6859f9b676-ntlnz\" (UID: \"7c58f90d-6708-4bb2-ba68-fb4e08b99c61\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.445769 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-87rrl\" (UniqueName: \"kubernetes.io/projected/b90a3b76-58bf-4c8a-a361-c59a183f6828-kube-api-access-87rrl\") pod \"telemetry-operator-controller-manager-5db5cf686f-wrcp9\" (UID: \"b90a3b76-58bf-4c8a-a361-c59a183f6828\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.445812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgh6w\" (UniqueName: \"kubernetes.io/projected/c7787291-34a2-4682-88f4-7b05f28f36f2-kube-api-access-mgh6w\") pod \"placement-operator-controller-manager-7d8bb7f44c-8gls6\" (UID: \"c7787291-34a2-4682-88f4-7b05f28f36f2\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" Oct 03 08:55:45 crc kubenswrapper[4790]: E1003 08:55:45.469217 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 08:55:45 crc kubenswrapper[4790]: E1003 08:55:45.469491 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/759c9730-8d00-4516-9ed2-79f56be527bf-cert podName:759c9730-8d00-4516-9ed2-79f56be527bf nodeName:}" failed. No retries permitted until 2025-10-03 08:55:45.969470658 +0000 UTC m=+932.322404896 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/759c9730-8d00-4516-9ed2-79f56be527bf-cert") pod "openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" (UID: "759c9730-8d00-4516-9ed2-79f56be527bf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.492082 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj75x\" (UniqueName: \"kubernetes.io/projected/e6f6790b-7aca-49c0-881f-8a7d15ac4071-kube-api-access-gj75x\") pod \"neutron-operator-controller-manager-6574bf987d-668k4\" (UID: \"e6f6790b-7aca-49c0-881f-8a7d15ac4071\") " pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.492463 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgh6w\" (UniqueName: \"kubernetes.io/projected/c7787291-34a2-4682-88f4-7b05f28f36f2-kube-api-access-mgh6w\") pod \"placement-operator-controller-manager-7d8bb7f44c-8gls6\" (UID: \"c7787291-34a2-4682-88f4-7b05f28f36f2\") " pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.515175 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh9g7\" (UniqueName: \"kubernetes.io/projected/b52c73ba-c737-4eaf-a913-155d05c96f6e-kube-api-access-lh9g7\") pod \"nova-operator-controller-manager-555c7456bd-tlmwh\" (UID: \"b52c73ba-c737-4eaf-a913-155d05c96f6e\") " pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.518639 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxvb\" (UniqueName: \"kubernetes.io/projected/759c9730-8d00-4516-9ed2-79f56be527bf-kube-api-access-8kxvb\") pod 
\"openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5\" (UID: \"759c9730-8d00-4516-9ed2-79f56be527bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.519966 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdkwk\" (UniqueName: \"kubernetes.io/projected/98c1e024-12c4-47f7-ac46-ae9f6bd45ab9-kube-api-access-zdkwk\") pod \"octavia-operator-controller-manager-59d6cfdf45-62zvh\" (UID: \"98c1e024-12c4-47f7-ac46-ae9f6bd45ab9\") " pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.526635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skpq9\" (UniqueName: \"kubernetes.io/projected/94dc0b91-323f-4390-8af1-71019409bdf2-kube-api-access-skpq9\") pod \"ovn-operator-controller-manager-688db7b6c7-kv6xz\" (UID: \"94dc0b91-323f-4390-8af1-71019409bdf2\") " pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.601678 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.602557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87rrl\" (UniqueName: \"kubernetes.io/projected/b90a3b76-58bf-4c8a-a361-c59a183f6828-kube-api-access-87rrl\") pod \"telemetry-operator-controller-manager-5db5cf686f-wrcp9\" (UID: \"b90a3b76-58bf-4c8a-a361-c59a183f6828\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.602758 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rkdd\" (UniqueName: 
\"kubernetes.io/projected/a583ed8e-8a45-4dc9-80e0-bf559d110818-kube-api-access-7rkdd\") pod \"watcher-operator-controller-manager-5c899c45b6-gb6mg\" (UID: \"a583ed8e-8a45-4dc9-80e0-bf559d110818\") " pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.602799 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9564\" (UniqueName: \"kubernetes.io/projected/15dfe278-fcef-4c6f-8c06-cce51c7d5f7c-kube-api-access-n9564\") pod \"test-operator-controller-manager-5cd5cb47d7-mshnf\" (UID: \"15dfe278-fcef-4c6f-8c06-cce51c7d5f7c\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.602936 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78t6l\" (UniqueName: \"kubernetes.io/projected/7c58f90d-6708-4bb2-ba68-fb4e08b99c61-kube-api-access-78t6l\") pod \"swift-operator-controller-manager-6859f9b676-ntlnz\" (UID: \"7c58f90d-6708-4bb2-ba68-fb4e08b99c61\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.604060 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.623114 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.625440 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.642221 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.642672 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-fhhnk" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.646829 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9564\" (UniqueName: \"kubernetes.io/projected/15dfe278-fcef-4c6f-8c06-cce51c7d5f7c-kube-api-access-n9564\") pod \"test-operator-controller-manager-5cd5cb47d7-mshnf\" (UID: \"15dfe278-fcef-4c6f-8c06-cce51c7d5f7c\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.666080 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.675537 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87rrl\" (UniqueName: \"kubernetes.io/projected/b90a3b76-58bf-4c8a-a361-c59a183f6828-kube-api-access-87rrl\") pod \"telemetry-operator-controller-manager-5db5cf686f-wrcp9\" (UID: \"b90a3b76-58bf-4c8a-a361-c59a183f6828\") " pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.675893 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78t6l\" (UniqueName: \"kubernetes.io/projected/7c58f90d-6708-4bb2-ba68-fb4e08b99c61-kube-api-access-78t6l\") pod \"swift-operator-controller-manager-6859f9b676-ntlnz\" (UID: \"7c58f90d-6708-4bb2-ba68-fb4e08b99c61\") " 
pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.707152 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rkdd\" (UniqueName: \"kubernetes.io/projected/a583ed8e-8a45-4dc9-80e0-bf559d110818-kube-api-access-7rkdd\") pod \"watcher-operator-controller-manager-5c899c45b6-gb6mg\" (UID: \"a583ed8e-8a45-4dc9-80e0-bf559d110818\") " pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.727371 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.728977 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.731179 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.731310 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.732119 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ng67g" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.732431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rkdd\" (UniqueName: \"kubernetes.io/projected/a583ed8e-8a45-4dc9-80e0-bf559d110818-kube-api-access-7rkdd\") pod \"watcher-operator-controller-manager-5c899c45b6-gb6mg\" (UID: \"a583ed8e-8a45-4dc9-80e0-bf559d110818\") " pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.739698 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.760739 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.776327 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-599898f689-h2km6"] Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.777945 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.793405 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.806199 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.808876 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-cert\") pod \"openstack-operator-controller-manager-6fb94b767d-nhfws\" (UID: \"c64c5a59-ac27-4428-8cdc-a1f8c96368bb\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.808987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtp5m\" (UniqueName: \"kubernetes.io/projected/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-kube-api-access-vtp5m\") pod \"openstack-operator-controller-manager-6fb94b767d-nhfws\" (UID: \"c64c5a59-ac27-4428-8cdc-a1f8c96368bb\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.809091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bc6d6a2-91a5-464c-bba7-3a00ee0b1938-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-894gw\" (UID: \"2bc6d6a2-91a5-464c-bba7-3a00ee0b1938\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.812444 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bc6d6a2-91a5-464c-bba7-3a00ee0b1938-cert\") pod \"infra-operator-controller-manager-5fbf469cd7-894gw\" (UID: \"2bc6d6a2-91a5-464c-bba7-3a00ee0b1938\") " pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.910686 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-cert\") pod \"openstack-operator-controller-manager-6fb94b767d-nhfws\" (UID: \"c64c5a59-ac27-4428-8cdc-a1f8c96368bb\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.910763 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtp5m\" (UniqueName: \"kubernetes.io/projected/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-kube-api-access-vtp5m\") pod \"openstack-operator-controller-manager-6fb94b767d-nhfws\" (UID: \"c64c5a59-ac27-4428-8cdc-a1f8c96368bb\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.910809 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9g89\" (UniqueName: \"kubernetes.io/projected/681ac688-7419-4d78-bf96-3a1d54c30d8b-kube-api-access-f9g89\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4\" (UID: \"681ac688-7419-4d78-bf96-3a1d54c30d8b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" Oct 03 08:55:45 crc kubenswrapper[4790]: E1003 08:55:45.911270 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 08:55:45 crc kubenswrapper[4790]: E1003 08:55:45.911315 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-cert podName:c64c5a59-ac27-4428-8cdc-a1f8c96368bb nodeName:}" failed. No retries permitted until 2025-10-03 08:55:46.411299121 +0000 UTC m=+932.764233359 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-cert") pod "openstack-operator-controller-manager-6fb94b767d-nhfws" (UID: "c64c5a59-ac27-4428-8cdc-a1f8c96368bb") : secret "webhook-server-cert" not found Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.917510 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.947120 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtp5m\" (UniqueName: \"kubernetes.io/projected/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-kube-api-access-vtp5m\") pod \"openstack-operator-controller-manager-6fb94b767d-nhfws\" (UID: \"c64c5a59-ac27-4428-8cdc-a1f8c96368bb\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.959933 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:55:45 crc kubenswrapper[4790]: I1003 08:55:45.973563 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" event={"ID":"d6653934-c835-4642-a1da-9159b7f85dcb","Type":"ContainerStarted","Data":"a1edd83910117fdcbcd8fcd0031a9f631794d3a443d2a60963403de6bb849de5"} Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.013065 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/759c9730-8d00-4516-9ed2-79f56be527bf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5\" (UID: \"759c9730-8d00-4516-9ed2-79f56be527bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.013266 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9g89\" (UniqueName: \"kubernetes.io/projected/681ac688-7419-4d78-bf96-3a1d54c30d8b-kube-api-access-f9g89\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4\" (UID: \"681ac688-7419-4d78-bf96-3a1d54c30d8b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.017731 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/759c9730-8d00-4516-9ed2-79f56be527bf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5\" (UID: \"759c9730-8d00-4516-9ed2-79f56be527bf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.036503 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9g89\" (UniqueName: 
\"kubernetes.io/projected/681ac688-7419-4d78-bf96-3a1d54c30d8b-kube-api-access-f9g89\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4\" (UID: \"681ac688-7419-4d78-bf96-3a1d54c30d8b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.128514 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.232012 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.426557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-cert\") pod \"openstack-operator-controller-manager-6fb94b767d-nhfws\" (UID: \"c64c5a59-ac27-4428-8cdc-a1f8c96368bb\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:46 crc kubenswrapper[4790]: E1003 08:55:46.427187 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 03 08:55:46 crc kubenswrapper[4790]: E1003 08:55:46.427238 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-cert podName:c64c5a59-ac27-4428-8cdc-a1f8c96368bb nodeName:}" failed. No retries permitted until 2025-10-03 08:55:47.427222156 +0000 UTC m=+933.780156384 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-cert") pod "openstack-operator-controller-manager-6fb94b767d-nhfws" (UID: "c64c5a59-ac27-4428-8cdc-a1f8c96368bb") : secret "webhook-server-cert" not found Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.650695 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r"] Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.673121 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m"] Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.682411 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw"] Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.687198 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z"] Oct 03 08:55:46 crc kubenswrapper[4790]: W1003 08:55:46.694167 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b9ab2ca_8c9e_46d1_af78_b811e32b05f0.slice/crio-0906ac59f36ca2e70532658b25a89ad9f43126fa9b89e4ab89d96989ae34d85c WatchSource:0}: Error finding container 0906ac59f36ca2e70532658b25a89ad9f43126fa9b89e4ab89d96989ae34d85c: Status 404 returned error can't find the container with id 0906ac59f36ca2e70532658b25a89ad9f43126fa9b89e4ab89d96989ae34d85c Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.706334 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5"] Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.880288 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9"] Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.917544 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g"] Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.942630 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q"] Oct 03 08:55:46 crc kubenswrapper[4790]: I1003 08:55:46.969839 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq"] Oct 03 08:55:46 crc kubenswrapper[4790]: W1003 08:55:46.974033 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eec0ecd_6bcb_4df5_b6bc_66780ca3e4d4.slice/crio-4e1a81f6e9a730d6ca70fd6a1bf8668d833bd177370714490844c99a8a12f85f WatchSource:0}: Error finding container 4e1a81f6e9a730d6ca70fd6a1bf8668d833bd177370714490844c99a8a12f85f: Status 404 returned error can't find the container with id 4e1a81f6e9a730d6ca70fd6a1bf8668d833bd177370714490844c99a8a12f85f Oct 03 08:55:46 crc kubenswrapper[4790]: W1003 08:55:46.998029 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb52c73ba_c737_4eaf_a913_155d05c96f6e.slice/crio-b0b98d2f01399bbb2038006e1818c05a642e89699d040dfc418f9ea25a43e4ce WatchSource:0}: Error finding container b0b98d2f01399bbb2038006e1818c05a642e89699d040dfc418f9ea25a43e4ce: Status 404 returned error can't find the container with id b0b98d2f01399bbb2038006e1818c05a642e89699d040dfc418f9ea25a43e4ce Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.007315 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:a82409e6d6a5554aad95acfe6fa4784e33de19a963eb8b1da1a80a3e6cf1ab55,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lh9g7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-555c7456bd-tlmwh_openstack-operators(b52c73ba-c737-4eaf-a913-155d05c96f6e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.008730 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:38abe6135ccaa369bc831f7878a6dfdf9a5a993a882e1c42073ca43582766f12,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hj2sr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-84bc9db6cc-2vq5g_openstack-operators(1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.022731 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" event={"ID":"f598237f-5311-44e2-8ed2-3de81bbebc91","Type":"ContainerStarted","Data":"ac07d9f2ecfce578a60c37c409f7cee3eca14a0e103c5b4396c7ac19c5c33194"} Oct 03 
08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.026893 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" event={"ID":"b90a3b76-58bf-4c8a-a361-c59a183f6828","Type":"ContainerStarted","Data":"ef0e877ed53f890debff4b485b9284a5cc2a2388480bf5912db266f3075258fb"} Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.029910 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4"] Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.032251 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" event={"ID":"12237b3a-fae2-4cac-b0d7-ee396be3da08","Type":"ContainerStarted","Data":"4b83762d30485cb9b908f8dbc9ba12c5ed6a38715fcce1a06cb1605aeaadd2f9"} Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.036361 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" event={"ID":"e6f6790b-7aca-49c0-881f-8a7d15ac4071","Type":"ContainerStarted","Data":"2261ce6f5e04241a4526b830c248a19f8ff2b4a3e9868b37f393f2aaba9c1487"} Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.039265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" event={"ID":"9b9ab2ca-8c9e-46d1-af78-b811e32b05f0","Type":"ContainerStarted","Data":"0906ac59f36ca2e70532658b25a89ad9f43126fa9b89e4ab89d96989ae34d85c"} Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.040654 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh"] Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.043837 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" 
event={"ID":"a168c4e9-522f-4f9b-97d3-5c50008ec59c","Type":"ContainerStarted","Data":"85d76a6f9ffc317ee78b629e1a83ff49161364af2ccc314d6075c17e7e1b0678"} Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.045478 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" event={"ID":"5b3d2a0d-d36d-4a67-9c5f-6a85735127a1","Type":"ContainerStarted","Data":"b783dfb5e7edc499b6c526be8bf855504796611e95c112f9e51f2d43da425478"} Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.049084 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" event={"ID":"b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf","Type":"ContainerStarted","Data":"64b2e13589c7484a5633314d129f470a2faa827faf6448bf83c91cf9c224edca"} Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.050892 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" event={"ID":"9cfa9536-48e0-4639-a0ed-acab9ab92ed0","Type":"ContainerStarted","Data":"17d7654bc52b1b4ea9bcc80a72b6f633066c3f6aa65e0fba9be031f56b72845a"} Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.055382 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6"] Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.062640 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh"] Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.176973 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6"] Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.196672 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg"] Oct 03 
08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.202704 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz"] Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.204989 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" podUID="1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4" Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.206702 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf"] Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.208062 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" podUID="b52c73ba-c737-4eaf-a913-155d05c96f6e" Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.208217 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-skpq9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-688db7b6c7-kv6xz_openstack-operators(94dc0b91-323f-4390-8af1-71019409bdf2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.208277 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mgh6w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-operator-controller-manager-7d8bb7f44c-8gls6_openstack-operators(c7787291-34a2-4682-88f4-7b05f28f36f2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.210720 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz"] Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.215436 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4"] Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.219450 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw"] Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.225521 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-78t6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-ntlnz_openstack-operators(7c58f90d-6708-4bb2-ba68-fb4e08b99c61): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:47 crc kubenswrapper[4790]: W1003 08:55:47.230164 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc6d6a2_91a5_464c_bba7_3a00ee0b1938.slice/crio-e6246bfa94fcd4320bab42009691ca75ead39834dad8e07c80463c0e062863d1 WatchSource:0}: Error finding container e6246bfa94fcd4320bab42009691ca75ead39834dad8e07c80463c0e062863d1: Status 404 returned error can't find the container with id e6246bfa94fcd4320bab42009691ca75ead39834dad8e07c80463c0e062863d1 Oct 03 08:55:47 crc kubenswrapper[4790]: W1003 08:55:47.231790 4790 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod681ac688_7419_4d78_bf96_3a1d54c30d8b.slice/crio-28efa074fc607b779b9176d1fcaea43e1606fba86c80e5599c27fb87d99ea6a8 WatchSource:0}: Error finding container 28efa074fc607b779b9176d1fcaea43e1606fba86c80e5599c27fb87d99ea6a8: Status 404 returned error can't find the container with id 28efa074fc607b779b9176d1fcaea43e1606fba86c80e5599c27fb87d99ea6a8 Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.244204 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f9g89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4_openstack-operators(681ac688-7419-4d78-bf96-3a1d54c30d8b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.244332 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n9564,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-mshnf_openstack-operators(15dfe278-fcef-4c6f-8c06-cce51c7d5f7c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.244329 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zn2js,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-5fbf469cd7-894gw_openstack-operators(2bc6d6a2-91a5-464c-bba7-3a00ee0b1938): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.245737 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" podUID="681ac688-7419-4d78-bf96-3a1d54c30d8b" Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.260013 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5"] Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.425821 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" podUID="94dc0b91-323f-4390-8af1-71019409bdf2" Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.449092 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-cert\") pod \"openstack-operator-controller-manager-6fb94b767d-nhfws\" (UID: \"c64c5a59-ac27-4428-8cdc-a1f8c96368bb\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.458896 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c64c5a59-ac27-4428-8cdc-a1f8c96368bb-cert\") pod \"openstack-operator-controller-manager-6fb94b767d-nhfws\" (UID: \"c64c5a59-ac27-4428-8cdc-a1f8c96368bb\") " pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.490684 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" podUID="7c58f90d-6708-4bb2-ba68-fb4e08b99c61" Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.525978 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" podUID="c7787291-34a2-4682-88f4-7b05f28f36f2" Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.529892 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" podUID="15dfe278-fcef-4c6f-8c06-cce51c7d5f7c" Oct 03 08:55:47 crc kubenswrapper[4790]: E1003 08:55:47.561902 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" 
podUID="2bc6d6a2-91a5-464c-bba7-3a00ee0b1938" Oct 03 08:55:47 crc kubenswrapper[4790]: I1003 08:55:47.624430 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.069987 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" event={"ID":"98c1e024-12c4-47f7-ac46-ae9f6bd45ab9","Type":"ContainerStarted","Data":"18f37d38baeec99e6d420e32aa149ba1aa7b2c3396ded1b53252cc89f4427332"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.071729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" event={"ID":"a583ed8e-8a45-4dc9-80e0-bf559d110818","Type":"ContainerStarted","Data":"882827950cbedc9a4e688e67d8a13af76fbd2f38ba8fb976d0f640d16980e79b"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.077992 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" event={"ID":"b52c73ba-c737-4eaf-a913-155d05c96f6e","Type":"ContainerStarted","Data":"d0a2ffeec7c30b2c4e1766f52141b6ffbaf2bdf0efe25fc9ec6ec003ffe9f00a"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.078017 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" event={"ID":"b52c73ba-c737-4eaf-a913-155d05c96f6e","Type":"ContainerStarted","Data":"b0b98d2f01399bbb2038006e1818c05a642e89699d040dfc418f9ea25a43e4ce"} Oct 03 08:55:48 crc kubenswrapper[4790]: E1003 08:55:48.083785 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a82409e6d6a5554aad95acfe6fa4784e33de19a963eb8b1da1a80a3e6cf1ab55\\\"\"" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" podUID="b52c73ba-c737-4eaf-a913-155d05c96f6e" Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.087774 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" event={"ID":"15dfe278-fcef-4c6f-8c06-cce51c7d5f7c","Type":"ContainerStarted","Data":"5b4e21eb319f9ed6765e53ba0c12db438e190a2f90058e7b32aeee0cf877400e"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.087834 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" event={"ID":"15dfe278-fcef-4c6f-8c06-cce51c7d5f7c","Type":"ContainerStarted","Data":"a5dc0b26ccc67eb5cf58800c45b1070b04de6c18771a6c294d1936f51443d2ab"} Oct 03 08:55:48 crc kubenswrapper[4790]: E1003 08:55:48.089639 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" podUID="15dfe278-fcef-4c6f-8c06-cce51c7d5f7c" Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.092252 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" event={"ID":"2bc6d6a2-91a5-464c-bba7-3a00ee0b1938","Type":"ContainerStarted","Data":"56f45ffe5d058e7a03f6372b703bc96c1c5dbbc4ea9f784c28a8ca69511c05b3"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.092291 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" 
event={"ID":"2bc6d6a2-91a5-464c-bba7-3a00ee0b1938","Type":"ContainerStarted","Data":"e6246bfa94fcd4320bab42009691ca75ead39834dad8e07c80463c0e062863d1"} Oct 03 08:55:48 crc kubenswrapper[4790]: E1003 08:55:48.093646 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475\\\"\"" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" podUID="2bc6d6a2-91a5-464c-bba7-3a00ee0b1938" Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.098712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" event={"ID":"759c9730-8d00-4516-9ed2-79f56be527bf","Type":"ContainerStarted","Data":"7636a45d3cb580977a11f60ec7f30c640500efa33532ea2850a04013de67296a"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.103746 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" event={"ID":"7c58f90d-6708-4bb2-ba68-fb4e08b99c61","Type":"ContainerStarted","Data":"3b3409334d663821f283d2e59512ec4de5a234770c3e673f232ad5d79399f515"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.103791 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" event={"ID":"7c58f90d-6708-4bb2-ba68-fb4e08b99c61","Type":"ContainerStarted","Data":"346bd4d692cb728f5d28bcaeb7398bb526aa272c74916245fec1576fc67290ce"} Oct 03 08:55:48 crc kubenswrapper[4790]: E1003 08:55:48.105600 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" podUID="7c58f90d-6708-4bb2-ba68-fb4e08b99c61" Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.107542 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" event={"ID":"c97f26b2-5313-49db-8ee8-bcde2e793bfa","Type":"ContainerStarted","Data":"94add0e743ca225da3f211c58154f1081f50ccf17484eecf987bcc5a8c17bf64"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.120021 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" event={"ID":"1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4","Type":"ContainerStarted","Data":"959a10c09593feba2a386ed757a521fd1b4d950fa5ce65bd4b3b9a48e0366fb4"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.120062 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" event={"ID":"1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4","Type":"ContainerStarted","Data":"4e1a81f6e9a730d6ca70fd6a1bf8668d833bd177370714490844c99a8a12f85f"} Oct 03 08:55:48 crc kubenswrapper[4790]: E1003 08:55:48.122812 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:38abe6135ccaa369bc831f7878a6dfdf9a5a993a882e1c42073ca43582766f12\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" podUID="1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4" Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.127840 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" 
event={"ID":"c7787291-34a2-4682-88f4-7b05f28f36f2","Type":"ContainerStarted","Data":"b4c9b53b4fbaacf3b389741e2adffa8f35c66c16f1c4c6fd0b48921a9aacaf0b"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.127874 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" event={"ID":"c7787291-34a2-4682-88f4-7b05f28f36f2","Type":"ContainerStarted","Data":"07c1aa6cf142beed37e5f58872711a7aa0ac997ad914c2d6253d790355b5c3ac"} Oct 03 08:55:48 crc kubenswrapper[4790]: E1003 08:55:48.129961 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548\\\"\"" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" podUID="c7787291-34a2-4682-88f4-7b05f28f36f2" Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.130536 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" event={"ID":"94dc0b91-323f-4390-8af1-71019409bdf2","Type":"ContainerStarted","Data":"335a0da885190db70675fc22653ffc6401fe8bbcdd21ea9d65c43b1039775fa5"} Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.130589 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" event={"ID":"94dc0b91-323f-4390-8af1-71019409bdf2","Type":"ContainerStarted","Data":"1897a81b23b8d108bc44ddfac586149d49607ce66a509fd8d84227d23265b0eb"} Oct 03 08:55:48 crc kubenswrapper[4790]: E1003 08:55:48.133951 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" podUID="94dc0b91-323f-4390-8af1-71019409bdf2" Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.134758 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" event={"ID":"681ac688-7419-4d78-bf96-3a1d54c30d8b","Type":"ContainerStarted","Data":"28efa074fc607b779b9176d1fcaea43e1606fba86c80e5599c27fb87d99ea6a8"} Oct 03 08:55:48 crc kubenswrapper[4790]: E1003 08:55:48.163837 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" podUID="681ac688-7419-4d78-bf96-3a1d54c30d8b" Oct 03 08:55:48 crc kubenswrapper[4790]: I1003 08:55:48.297773 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws"] Oct 03 08:55:49 crc kubenswrapper[4790]: I1003 08:55:49.207455 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" event={"ID":"c64c5a59-ac27-4428-8cdc-a1f8c96368bb","Type":"ContainerStarted","Data":"2e82ce541a5c1010e9d80945e46424b0d084bfc4585643fea6bfc0b74000edda"} Oct 03 08:55:49 crc kubenswrapper[4790]: I1003 08:55:49.207531 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:49 crc kubenswrapper[4790]: I1003 08:55:49.207564 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" event={"ID":"c64c5a59-ac27-4428-8cdc-a1f8c96368bb","Type":"ContainerStarted","Data":"e8f427154c0ee7170fde1089d5da1c37eadf569466c8a05d19e42afe42134f43"} Oct 03 08:55:49 crc kubenswrapper[4790]: I1003 08:55:49.207607 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" event={"ID":"c64c5a59-ac27-4428-8cdc-a1f8c96368bb","Type":"ContainerStarted","Data":"e78453c2121454e170d8f61faf5b80cd7a5e13da0f28015d86f438e7893b9593"} Oct 03 08:55:49 crc kubenswrapper[4790]: E1003 08:55:49.221622 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:40fb1819b6639807b77ef79448d35f1e4bfc1838a09d4f380e9fa0f755352475\\\"\"" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" podUID="2bc6d6a2-91a5-464c-bba7-3a00ee0b1938" Oct 03 08:55:49 crc kubenswrapper[4790]: E1003 08:55:49.222062 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" podUID="15dfe278-fcef-4c6f-8c06-cce51c7d5f7c" Oct 03 08:55:49 crc kubenswrapper[4790]: E1003 08:55:49.222102 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:725da67b3f9cf2758564e0111928cdd570c0f6f1ca34775f159bbe94deb82548\\\"\"" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" podUID="c7787291-34a2-4682-88f4-7b05f28f36f2" 
Oct 03 08:55:49 crc kubenswrapper[4790]: E1003 08:55:49.222140 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5c6ab93b78bd20eb7f1736751a59c1eb33fb06351339563dbefe49ccaaff6e94\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" podUID="94dc0b91-323f-4390-8af1-71019409bdf2" Oct 03 08:55:49 crc kubenswrapper[4790]: E1003 08:55:49.222191 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" podUID="7c58f90d-6708-4bb2-ba68-fb4e08b99c61" Oct 03 08:55:49 crc kubenswrapper[4790]: E1003 08:55:49.222252 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a82409e6d6a5554aad95acfe6fa4784e33de19a963eb8b1da1a80a3e6cf1ab55\\\"\"" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" podUID="b52c73ba-c737-4eaf-a913-155d05c96f6e" Oct 03 08:55:49 crc kubenswrapper[4790]: E1003 08:55:49.222718 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" podUID="681ac688-7419-4d78-bf96-3a1d54c30d8b" Oct 03 08:55:49 crc kubenswrapper[4790]: E1003 08:55:49.222764 4790 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:38abe6135ccaa369bc831f7878a6dfdf9a5a993a882e1c42073ca43582766f12\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" podUID="1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4" Oct 03 08:55:49 crc kubenswrapper[4790]: I1003 08:55:49.260968 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" podStartSLOduration=4.260902957 podStartE2EDuration="4.260902957s" podCreationTimestamp="2025-10-03 08:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:55:49.246306901 +0000 UTC m=+935.599241139" watchObservedRunningTime="2025-10-03 08:55:49.260902957 +0000 UTC m=+935.613837205" Oct 03 08:55:57 crc kubenswrapper[4790]: I1003 08:55:57.634075 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6fb94b767d-nhfws" Oct 03 08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.282511 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" event={"ID":"b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf","Type":"ContainerStarted","Data":"3175412ddabde3091bc24f2e88e88448e74335a81e51c48ed68bb635712aeea9"} Oct 03 08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.287811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" event={"ID":"9cfa9536-48e0-4639-a0ed-acab9ab92ed0","Type":"ContainerStarted","Data":"3739b6800300d1743932df9a686a33b40fd890583556bd42a4565ad66513cb96"} Oct 03 08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.291526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" event={"ID":"759c9730-8d00-4516-9ed2-79f56be527bf","Type":"ContainerStarted","Data":"b50a58b58a9366b0888966956493f7c561c9fcb1da742ab59592e48813f52174"} Oct 03 08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.296032 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" event={"ID":"b90a3b76-58bf-4c8a-a361-c59a183f6828","Type":"ContainerStarted","Data":"3804c119baad0c01d57d7ecdc3f6c0d71ebeb2fec76260fd4fb5f6bdeb4c089b"} Oct 03 08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.298635 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" event={"ID":"c97f26b2-5313-49db-8ee8-bcde2e793bfa","Type":"ContainerStarted","Data":"d2171f28e364f381d6314a132cff4cc68618c6e24352b7cc7e2a7f2a30aa4fcd"} Oct 03 08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.300811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" event={"ID":"e6f6790b-7aca-49c0-881f-8a7d15ac4071","Type":"ContainerStarted","Data":"e3da318903b0e4d4de0ee8ff3948eed3057048173454746825c895bc623c1362"} Oct 03 08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.302944 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" event={"ID":"9b9ab2ca-8c9e-46d1-af78-b811e32b05f0","Type":"ContainerStarted","Data":"d4d3c5e8f125473db1c5c7b58656d6cbd754a3f0cc048bc937a0fb559375372e"} Oct 03 08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.304810 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" event={"ID":"a168c4e9-522f-4f9b-97d3-5c50008ec59c","Type":"ContainerStarted","Data":"a0a7cde67ac9e48ec3cb2e7af9f0c766045984a90d0c9605b8f7f5fce0311ebb"} Oct 03 
08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.312884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" event={"ID":"a583ed8e-8a45-4dc9-80e0-bf559d110818","Type":"ContainerStarted","Data":"b750e902b66e2a6bff065991ddb85f89e9bd7501481bfa873697975e444df770"} Oct 03 08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.325098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" event={"ID":"d6653934-c835-4642-a1da-9159b7f85dcb","Type":"ContainerStarted","Data":"0159db989e3434e03d765eae419a5f7f1964b495406745ec5fe4221adb3537fe"} Oct 03 08:55:58 crc kubenswrapper[4790]: I1003 08:55:58.349739 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" event={"ID":"f598237f-5311-44e2-8ed2-3de81bbebc91","Type":"ContainerStarted","Data":"000ec5e79b7c91002f08a6f7298f009e4c7681c0b04d2d797b16345f2168406f"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.357292 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" event={"ID":"759c9730-8d00-4516-9ed2-79f56be527bf","Type":"ContainerStarted","Data":"0c52729331dfe90fd126885dd38f927c5aaaf2c1f749e2647db7d81aa728a791"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.358410 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.360986 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" event={"ID":"b90a3b76-58bf-4c8a-a361-c59a183f6828","Type":"ContainerStarted","Data":"d2f2094973acda05c3315ecb435b85d4e8856bacfcf3a5329e90a5d1f892debd"} Oct 03 08:55:59 crc 
kubenswrapper[4790]: I1003 08:55:59.361134 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.363439 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" event={"ID":"f598237f-5311-44e2-8ed2-3de81bbebc91","Type":"ContainerStarted","Data":"508515255b24b3102c0a2ff55e89c896ef8a8b0b5d502036b8a1ea9d7b97d425"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.363535 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.369621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" event={"ID":"c97f26b2-5313-49db-8ee8-bcde2e793bfa","Type":"ContainerStarted","Data":"4d15d1f979fcad3d21d9933dbaa0ed66ebbda175442c2a3f857bf043c331f593"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.370435 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.372192 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" event={"ID":"b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf","Type":"ContainerStarted","Data":"f3ec617303d8bd50cd0e9439b7536813bfb8f1667218b3ee8602f49c899f7cf8"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.372674 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.373950 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" event={"ID":"a168c4e9-522f-4f9b-97d3-5c50008ec59c","Type":"ContainerStarted","Data":"681decde4a63162b4cec202e17e82c1958d92fd84ced7cf6f9f16fc88284969d"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.374319 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.375612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" event={"ID":"a583ed8e-8a45-4dc9-80e0-bf559d110818","Type":"ContainerStarted","Data":"1e9e04b7418ab13916b0ae8e5b5727e1aab48c3e08fd3b4edc4c6072fdedcf4c"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.375977 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.377449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" event={"ID":"d6653934-c835-4642-a1da-9159b7f85dcb","Type":"ContainerStarted","Data":"6f85aa03e1d8071ce56e0565651bbe3ef491eeb41d688c2aa26ba5d14483ada9"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.377856 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.379408 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" event={"ID":"9b9ab2ca-8c9e-46d1-af78-b811e32b05f0","Type":"ContainerStarted","Data":"52ea289fed7ef1a62b764b2bcae79ab45f6870a240fa7cc76c5db0be3be59c4f"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.379873 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.381270 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" event={"ID":"9cfa9536-48e0-4639-a0ed-acab9ab92ed0","Type":"ContainerStarted","Data":"5076cac2e9fe8310a4cf816dddc075d20efb209720f0ba662ed93843c99631e0"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.381623 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.383190 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" event={"ID":"12237b3a-fae2-4cac-b0d7-ee396be3da08","Type":"ContainerStarted","Data":"e22f935c7bef72491114bb0c88de5fe37e19ec74240d6dc19407bb7464651697"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.383214 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" event={"ID":"12237b3a-fae2-4cac-b0d7-ee396be3da08","Type":"ContainerStarted","Data":"274a94758771268f13c3eaf0c886989038b6b96f24efdc24f9146810c4d897de"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.383385 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.385335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" event={"ID":"5b3d2a0d-d36d-4a67-9c5f-6a85735127a1","Type":"ContainerStarted","Data":"38f201b66968c26ab1647a0b42d1cb64da638c0b24cc4e1ef7feeab59dc80e35"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.385360 
4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" event={"ID":"5b3d2a0d-d36d-4a67-9c5f-6a85735127a1","Type":"ContainerStarted","Data":"fda02f17e9467897b1b41c914a9feba2ca8ab8444b34f95507d82c2665189949"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.385392 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.387215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" event={"ID":"98c1e024-12c4-47f7-ac46-ae9f6bd45ab9","Type":"ContainerStarted","Data":"285556fd6a678dfca8f2f43d6915790b852e2c01181e1b65a2ac1fb241ca765c"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.387238 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" event={"ID":"98c1e024-12c4-47f7-ac46-ae9f6bd45ab9","Type":"ContainerStarted","Data":"867df5fa900b119bcbfd764f0b48d4b6be8134b499f4592c16720c95b4d783e1"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.387596 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.389045 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" event={"ID":"e6f6790b-7aca-49c0-881f-8a7d15ac4071","Type":"ContainerStarted","Data":"53a9f1922e7484fd0a5ac48ad464ad41952ea8286530b5d4c7df9f734d02d837"} Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.389376 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" Oct 03 08:55:59 crc 
kubenswrapper[4790]: I1003 08:55:59.395010 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" podStartSLOduration=5.118654309 podStartE2EDuration="15.394997016s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:47.312854348 +0000 UTC m=+933.665788586" lastFinishedPulling="2025-10-03 08:55:57.589197045 +0000 UTC m=+943.942131293" observedRunningTime="2025-10-03 08:55:59.389873997 +0000 UTC m=+945.742808235" watchObservedRunningTime="2025-10-03 08:55:59.394997016 +0000 UTC m=+945.747931254" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.413045 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" podStartSLOduration=4.815570718 podStartE2EDuration="15.413031335s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.976264018 +0000 UTC m=+933.329198256" lastFinishedPulling="2025-10-03 08:55:57.573724625 +0000 UTC m=+943.926658873" observedRunningTime="2025-10-03 08:55:59.411368659 +0000 UTC m=+945.764302897" watchObservedRunningTime="2025-10-03 08:55:59.413031335 +0000 UTC m=+945.765965573" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.429636 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" podStartSLOduration=3.745237506 podStartE2EDuration="15.429621885s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:45.900150879 +0000 UTC m=+932.253085117" lastFinishedPulling="2025-10-03 08:55:57.584535238 +0000 UTC m=+943.937469496" observedRunningTime="2025-10-03 08:55:59.427873597 +0000 UTC m=+945.780807835" watchObservedRunningTime="2025-10-03 08:55:59.429621885 +0000 UTC m=+945.782556123" Oct 03 08:55:59 crc kubenswrapper[4790]: 
I1003 08:55:59.469342 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" podStartSLOduration=4.834873731 podStartE2EDuration="15.469325701s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.976550136 +0000 UTC m=+933.329484384" lastFinishedPulling="2025-10-03 08:55:57.611002106 +0000 UTC m=+943.963936354" observedRunningTime="2025-10-03 08:55:59.466809463 +0000 UTC m=+945.819743701" watchObservedRunningTime="2025-10-03 08:55:59.469325701 +0000 UTC m=+945.822259939" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.471151 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" podStartSLOduration=4.592901768 podStartE2EDuration="15.471141951s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.722291709 +0000 UTC m=+933.075225947" lastFinishedPulling="2025-10-03 08:55:57.600531882 +0000 UTC m=+943.953466130" observedRunningTime="2025-10-03 08:55:59.449609507 +0000 UTC m=+945.802543745" watchObservedRunningTime="2025-10-03 08:55:59.471141951 +0000 UTC m=+945.824076179" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.488106 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" podStartSLOduration=4.099317263 podStartE2EDuration="14.488091221s" podCreationTimestamp="2025-10-03 08:55:45 +0000 UTC" firstStartedPulling="2025-10-03 08:55:47.185368089 +0000 UTC m=+933.538302327" lastFinishedPulling="2025-10-03 08:55:57.574142027 +0000 UTC m=+943.927076285" observedRunningTime="2025-10-03 08:55:59.48510183 +0000 UTC m=+945.838036078" watchObservedRunningTime="2025-10-03 08:55:59.488091221 +0000 UTC m=+945.841025459" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.525137 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" podStartSLOduration=4.6291121 podStartE2EDuration="15.525093274s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.693758645 +0000 UTC m=+933.046692883" lastFinishedPulling="2025-10-03 08:55:57.589739809 +0000 UTC m=+943.942674057" observedRunningTime="2025-10-03 08:55:59.511321611 +0000 UTC m=+945.864255859" watchObservedRunningTime="2025-10-03 08:55:59.525093274 +0000 UTC m=+945.878027522" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.526147 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" podStartSLOduration=4.909893236 podStartE2EDuration="15.526139952s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.990363271 +0000 UTC m=+933.343297529" lastFinishedPulling="2025-10-03 08:55:57.606609987 +0000 UTC m=+943.959544245" observedRunningTime="2025-10-03 08:55:59.523389138 +0000 UTC m=+945.876323416" watchObservedRunningTime="2025-10-03 08:55:59.526139952 +0000 UTC m=+945.879074190" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.547098 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" podStartSLOduration=4.685526701 podStartE2EDuration="15.547072111s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.722289819 +0000 UTC m=+933.075224057" lastFinishedPulling="2025-10-03 08:55:57.583835219 +0000 UTC m=+943.936769467" observedRunningTime="2025-10-03 08:55:59.544056889 +0000 UTC m=+945.896991137" watchObservedRunningTime="2025-10-03 08:55:59.547072111 +0000 UTC m=+945.900006349" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.569498 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" podStartSLOduration=4.674012207 podStartE2EDuration="15.569483248s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.693727624 +0000 UTC m=+933.046661862" lastFinishedPulling="2025-10-03 08:55:57.589198665 +0000 UTC m=+943.942132903" observedRunningTime="2025-10-03 08:55:59.565683606 +0000 UTC m=+945.918617844" watchObservedRunningTime="2025-10-03 08:55:59.569483248 +0000 UTC m=+945.922417486" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.587408 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" podStartSLOduration=4.989612108 podStartE2EDuration="15.587394114s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.994160553 +0000 UTC m=+933.347094791" lastFinishedPulling="2025-10-03 08:55:57.591942549 +0000 UTC m=+943.944876797" observedRunningTime="2025-10-03 08:55:59.58392354 +0000 UTC m=+945.936857788" watchObservedRunningTime="2025-10-03 08:55:59.587394114 +0000 UTC m=+945.940328352" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.600481 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" podStartSLOduration=4.992814185 podStartE2EDuration="15.600468379s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.983426912 +0000 UTC m=+933.336361150" lastFinishedPulling="2025-10-03 08:55:57.591081096 +0000 UTC m=+943.944015344" observedRunningTime="2025-10-03 08:55:59.599787441 +0000 UTC m=+945.952721679" watchObservedRunningTime="2025-10-03 08:55:59.600468379 +0000 UTC m=+945.953402617" Oct 03 08:55:59 crc kubenswrapper[4790]: I1003 08:55:59.621827 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" podStartSLOduration=3.970740526 podStartE2EDuration="14.621813048s" podCreationTimestamp="2025-10-03 08:55:45 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.92807501 +0000 UTC m=+933.281009248" lastFinishedPulling="2025-10-03 08:55:57.579147522 +0000 UTC m=+943.932081770" observedRunningTime="2025-10-03 08:55:59.619517346 +0000 UTC m=+945.972451584" watchObservedRunningTime="2025-10-03 08:55:59.621813048 +0000 UTC m=+945.974747286" Oct 03 08:56:00 crc kubenswrapper[4790]: I1003 08:56:00.330649 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 08:56:03 crc kubenswrapper[4790]: I1003 08:56:03.419598 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" event={"ID":"b52c73ba-c737-4eaf-a913-155d05c96f6e","Type":"ContainerStarted","Data":"935ceeb85d0e9644973ba86648af2dda2d5bc8cd75f56f46aeb464776ced32ea"} Oct 03 08:56:03 crc kubenswrapper[4790]: I1003 08:56:03.420600 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" Oct 03 08:56:03 crc kubenswrapper[4790]: I1003 08:56:03.424911 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" event={"ID":"15dfe278-fcef-4c6f-8c06-cce51c7d5f7c","Type":"ContainerStarted","Data":"d486b78726a5020d50fedef1e41ac1f7c45ef9cddde4975328c8cb97b8288a5f"} Oct 03 08:56:03 crc kubenswrapper[4790]: I1003 08:56:03.425178 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" Oct 03 08:56:03 crc kubenswrapper[4790]: I1003 08:56:03.441049 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" podStartSLOduration=8.542638451 podStartE2EDuration="19.4410283s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:46.702919544 +0000 UTC m=+933.055853782" lastFinishedPulling="2025-10-03 08:55:57.601309383 +0000 UTC m=+943.954243631" observedRunningTime="2025-10-03 08:55:59.636761593 +0000 UTC m=+945.989695831" watchObservedRunningTime="2025-10-03 08:56:03.4410283 +0000 UTC m=+949.793962538" Oct 03 08:56:03 crc kubenswrapper[4790]: I1003 08:56:03.444662 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" podStartSLOduration=4.037639955 podStartE2EDuration="19.444614717s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:47.006632371 +0000 UTC m=+933.359566609" lastFinishedPulling="2025-10-03 08:56:02.413607133 +0000 UTC m=+948.766541371" observedRunningTime="2025-10-03 08:56:03.437074844 +0000 UTC m=+949.790009082" watchObservedRunningTime="2025-10-03 08:56:03.444614717 +0000 UTC m=+949.797548985" Oct 03 08:56:03 crc kubenswrapper[4790]: I1003 08:56:03.461046 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" podStartSLOduration=3.277915174 podStartE2EDuration="18.461021443s" podCreationTimestamp="2025-10-03 08:55:45 +0000 UTC" firstStartedPulling="2025-10-03 08:55:47.244264177 +0000 UTC m=+933.597198415" lastFinishedPulling="2025-10-03 08:56:02.427370446 +0000 UTC m=+948.780304684" observedRunningTime="2025-10-03 08:56:03.459383418 +0000 UTC m=+949.812317666" watchObservedRunningTime="2025-10-03 08:56:03.461021443 +0000 UTC m=+949.813955681" Oct 03 08:56:04 crc kubenswrapper[4790]: I1003 08:56:04.980427 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-6c675fb79f-rtftw" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.010456 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-79d68d6c85-q7g8m" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.015937 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-s25k5" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.080204 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-846dff85b5-pqh6z" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.093032 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-599898f689-h2km6" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.140535 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6769b867d9-d6t4r" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.276459 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7f55849f88-twvmq" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.436733 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6fd6854b49-k8l6q" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.450603 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5c468bf4d4-sdqf6" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.451037 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" 
event={"ID":"7c58f90d-6708-4bb2-ba68-fb4e08b99c61","Type":"ContainerStarted","Data":"0b36490a2d8e403baacf5a78de5124c4d4f228008e854b4112639a9ac68b66f8"} Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.451259 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.457074 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" event={"ID":"c7787291-34a2-4682-88f4-7b05f28f36f2","Type":"ContainerStarted","Data":"c2221ce9250bdcb7a72cce4d0cb7750bff1ad843c0d55d891106a22723fa842a"} Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.457269 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.459740 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" event={"ID":"94dc0b91-323f-4390-8af1-71019409bdf2","Type":"ContainerStarted","Data":"efe536ae84276e5be048d522b74217bf41b0074298ff18774876ab6b350b8e96"} Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.459901 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.467375 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" event={"ID":"1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4","Type":"ContainerStarted","Data":"6483008c1d410814b1eaec025035fd5161322f5e0e2dc4af55ed97c6af84fb1c"} Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.468029 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.513008 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" podStartSLOduration=3.090687035 podStartE2EDuration="20.512987651s" podCreationTimestamp="2025-10-03 08:55:45 +0000 UTC" firstStartedPulling="2025-10-03 08:55:47.225372494 +0000 UTC m=+933.578306732" lastFinishedPulling="2025-10-03 08:56:04.64767311 +0000 UTC m=+951.000607348" observedRunningTime="2025-10-03 08:56:05.510222756 +0000 UTC m=+951.863156994" watchObservedRunningTime="2025-10-03 08:56:05.512987651 +0000 UTC m=+951.865921899" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.532166 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" podStartSLOduration=4.026591166 podStartE2EDuration="21.5321463s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:47.208082535 +0000 UTC m=+933.561016773" lastFinishedPulling="2025-10-03 08:56:04.713637669 +0000 UTC m=+951.066571907" observedRunningTime="2025-10-03 08:56:05.527651819 +0000 UTC m=+951.880586057" watchObservedRunningTime="2025-10-03 08:56:05.5321463 +0000 UTC m=+951.885080538" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.543141 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" podStartSLOduration=4.103577434 podStartE2EDuration="21.543124838s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:47.208110196 +0000 UTC m=+933.561044434" lastFinishedPulling="2025-10-03 08:56:04.6476576 +0000 UTC m=+951.000591838" observedRunningTime="2025-10-03 08:56:05.540406934 +0000 UTC m=+951.893341172" watchObservedRunningTime="2025-10-03 
08:56:05.543124838 +0000 UTC m=+951.896059076" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.570176 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" podStartSLOduration=3.8966569030000002 podStartE2EDuration="21.570159992s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:47.008455491 +0000 UTC m=+933.361389729" lastFinishedPulling="2025-10-03 08:56:04.68195858 +0000 UTC m=+951.034892818" observedRunningTime="2025-10-03 08:56:05.555233127 +0000 UTC m=+951.908167385" watchObservedRunningTime="2025-10-03 08:56:05.570159992 +0000 UTC m=+951.923094230" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.609915 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6574bf987d-668k4" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.734119 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5db5cf686f-wrcp9" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.735341 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-59d6cfdf45-62zvh" Oct 03 08:56:05 crc kubenswrapper[4790]: I1003 08:56:05.798125 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5c899c45b6-gb6mg" Oct 03 08:56:06 crc kubenswrapper[4790]: I1003 08:56:06.239975 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5" Oct 03 08:56:07 crc kubenswrapper[4790]: I1003 08:56:07.484282 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" 
event={"ID":"681ac688-7419-4d78-bf96-3a1d54c30d8b","Type":"ContainerStarted","Data":"27c061f83297a94d1951f3cf073280f3bb5021326dc9b80270b7c05f6ccb58d2"} Oct 03 08:56:07 crc kubenswrapper[4790]: I1003 08:56:07.486612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" event={"ID":"2bc6d6a2-91a5-464c-bba7-3a00ee0b1938","Type":"ContainerStarted","Data":"aa5b67136bf9bd0375bf7760ba9c901f32dcf2e0ae057234cb40374d475a4b3b"} Oct 03 08:56:07 crc kubenswrapper[4790]: I1003 08:56:07.487106 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:56:07 crc kubenswrapper[4790]: I1003 08:56:07.503360 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4" podStartSLOduration=3.161958749 podStartE2EDuration="22.503331687s" podCreationTimestamp="2025-10-03 08:55:45 +0000 UTC" firstStartedPulling="2025-10-03 08:55:47.244077692 +0000 UTC m=+933.597011930" lastFinishedPulling="2025-10-03 08:56:06.58545063 +0000 UTC m=+952.938384868" observedRunningTime="2025-10-03 08:56:07.501769114 +0000 UTC m=+953.854703362" watchObservedRunningTime="2025-10-03 08:56:07.503331687 +0000 UTC m=+953.856265965" Oct 03 08:56:07 crc kubenswrapper[4790]: I1003 08:56:07.527268 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" podStartSLOduration=4.183055232 podStartE2EDuration="23.527243636s" podCreationTimestamp="2025-10-03 08:55:44 +0000 UTC" firstStartedPulling="2025-10-03 08:55:47.244144554 +0000 UTC m=+933.597078792" lastFinishedPulling="2025-10-03 08:56:06.588332958 +0000 UTC m=+952.941267196" observedRunningTime="2025-10-03 08:56:07.522926079 +0000 UTC m=+953.875860337" watchObservedRunningTime="2025-10-03 08:56:07.527243636 +0000 UTC 
m=+953.880177884" Oct 03 08:56:10 crc kubenswrapper[4790]: I1003 08:56:10.185950 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:56:10 crc kubenswrapper[4790]: I1003 08:56:10.186028 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:56:10 crc kubenswrapper[4790]: I1003 08:56:10.186085 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 08:56:10 crc kubenswrapper[4790]: I1003 08:56:10.186992 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2e53bba0341ebe043843fcef071e2d9f340680e77501673a1e1a360df997492"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 08:56:10 crc kubenswrapper[4790]: I1003 08:56:10.187078 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://e2e53bba0341ebe043843fcef071e2d9f340680e77501673a1e1a360df997492" gracePeriod=600 Oct 03 08:56:10 crc kubenswrapper[4790]: I1003 08:56:10.515424 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" 
containerID="e2e53bba0341ebe043843fcef071e2d9f340680e77501673a1e1a360df997492" exitCode=0 Oct 03 08:56:10 crc kubenswrapper[4790]: I1003 08:56:10.515927 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"e2e53bba0341ebe043843fcef071e2d9f340680e77501673a1e1a360df997492"} Oct 03 08:56:10 crc kubenswrapper[4790]: I1003 08:56:10.515975 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"ab4e236d11b4b58cf11bf821ee4fe0466ca737afca708932f0cbf0c7c147ea38"} Oct 03 08:56:10 crc kubenswrapper[4790]: I1003 08:56:10.516005 4790 scope.go:117] "RemoveContainer" containerID="4ae452d98acbb7343cd83b179511e310c36044718e3f86be8c5675a6c5f1473a" Oct 03 08:56:15 crc kubenswrapper[4790]: I1003 08:56:15.243366 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-84bc9db6cc-2vq5g" Oct 03 08:56:15 crc kubenswrapper[4790]: I1003 08:56:15.629141 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-555c7456bd-tlmwh" Oct 03 08:56:15 crc kubenswrapper[4790]: I1003 08:56:15.764641 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-7d8bb7f44c-8gls6" Oct 03 08:56:15 crc kubenswrapper[4790]: I1003 08:56:15.782673 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-mshnf" Oct 03 08:56:15 crc kubenswrapper[4790]: I1003 08:56:15.812957 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-688db7b6c7-kv6xz" Oct 03 08:56:15 crc kubenswrapper[4790]: I1003 08:56:15.920999 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-ntlnz" Oct 03 08:56:15 crc kubenswrapper[4790]: I1003 08:56:15.967337 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5fbf469cd7-894gw" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.275563 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d46db5bb7-87gxz"] Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.277354 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.279188 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.282351 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.282613 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.282748 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lhl8z" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.292648 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d46db5bb7-87gxz"] Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.347525 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59c78cff8f-gxv7p"] Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.351401 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.353884 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.367747 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59c78cff8f-gxv7p"] Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.430417 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmwdq\" (UniqueName: \"kubernetes.io/projected/c4776223-9bc5-480f-acb7-d71c662bd3a8-kube-api-access-hmwdq\") pod \"dnsmasq-dns-5d46db5bb7-87gxz\" (UID: \"c4776223-9bc5-480f-acb7-d71c662bd3a8\") " pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.430566 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4776223-9bc5-480f-acb7-d71c662bd3a8-config\") pod \"dnsmasq-dns-5d46db5bb7-87gxz\" (UID: \"c4776223-9bc5-480f-acb7-d71c662bd3a8\") " pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.532246 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmwdq\" (UniqueName: \"kubernetes.io/projected/c4776223-9bc5-480f-acb7-d71c662bd3a8-kube-api-access-hmwdq\") pod \"dnsmasq-dns-5d46db5bb7-87gxz\" (UID: \"c4776223-9bc5-480f-acb7-d71c662bd3a8\") " pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.532608 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4776223-9bc5-480f-acb7-d71c662bd3a8-config\") pod \"dnsmasq-dns-5d46db5bb7-87gxz\" (UID: \"c4776223-9bc5-480f-acb7-d71c662bd3a8\") " pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" Oct 03 08:56:34 crc 
kubenswrapper[4790]: I1003 08:56:34.532645 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-dns-svc\") pod \"dnsmasq-dns-59c78cff8f-gxv7p\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.532663 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxrw\" (UniqueName: \"kubernetes.io/projected/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-kube-api-access-gxxrw\") pod \"dnsmasq-dns-59c78cff8f-gxv7p\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.532684 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-config\") pod \"dnsmasq-dns-59c78cff8f-gxv7p\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.533520 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4776223-9bc5-480f-acb7-d71c662bd3a8-config\") pod \"dnsmasq-dns-5d46db5bb7-87gxz\" (UID: \"c4776223-9bc5-480f-acb7-d71c662bd3a8\") " pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.554440 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmwdq\" (UniqueName: \"kubernetes.io/projected/c4776223-9bc5-480f-acb7-d71c662bd3a8-kube-api-access-hmwdq\") pod \"dnsmasq-dns-5d46db5bb7-87gxz\" (UID: \"c4776223-9bc5-480f-acb7-d71c662bd3a8\") " pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 
08:56:34.594422 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.634313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-dns-svc\") pod \"dnsmasq-dns-59c78cff8f-gxv7p\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.634364 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxrw\" (UniqueName: \"kubernetes.io/projected/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-kube-api-access-gxxrw\") pod \"dnsmasq-dns-59c78cff8f-gxv7p\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.634391 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-config\") pod \"dnsmasq-dns-59c78cff8f-gxv7p\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.635211 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-dns-svc\") pod \"dnsmasq-dns-59c78cff8f-gxv7p\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.635350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-config\") pod \"dnsmasq-dns-59c78cff8f-gxv7p\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " 
pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.650287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxrw\" (UniqueName: \"kubernetes.io/projected/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-kube-api-access-gxxrw\") pod \"dnsmasq-dns-59c78cff8f-gxv7p\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:34 crc kubenswrapper[4790]: I1003 08:56:34.680042 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:35 crc kubenswrapper[4790]: I1003 08:56:35.027261 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d46db5bb7-87gxz"] Oct 03 08:56:35 crc kubenswrapper[4790]: W1003 08:56:35.033972 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4776223_9bc5_480f_acb7_d71c662bd3a8.slice/crio-4d2170a2549f7289e12696df3e75af242a8b6e6877c54a105783ee846094999c WatchSource:0}: Error finding container 4d2170a2549f7289e12696df3e75af242a8b6e6877c54a105783ee846094999c: Status 404 returned error can't find the container with id 4d2170a2549f7289e12696df3e75af242a8b6e6877c54a105783ee846094999c Oct 03 08:56:35 crc kubenswrapper[4790]: I1003 08:56:35.167254 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59c78cff8f-gxv7p"] Oct 03 08:56:35 crc kubenswrapper[4790]: W1003 08:56:35.178380 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c60d6d1_6e32_4c60_a7c8_7bfb48d7267b.slice/crio-24d354d8f83d55e63cab2b1e36adaa11a290f9cfe4cf5dd8ff734068e84e4dc8 WatchSource:0}: Error finding container 24d354d8f83d55e63cab2b1e36adaa11a290f9cfe4cf5dd8ff734068e84e4dc8: Status 404 returned error can't find the container with id 
24d354d8f83d55e63cab2b1e36adaa11a290f9cfe4cf5dd8ff734068e84e4dc8 Oct 03 08:56:35 crc kubenswrapper[4790]: I1003 08:56:35.725733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" event={"ID":"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b","Type":"ContainerStarted","Data":"24d354d8f83d55e63cab2b1e36adaa11a290f9cfe4cf5dd8ff734068e84e4dc8"} Oct 03 08:56:35 crc kubenswrapper[4790]: I1003 08:56:35.728680 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" event={"ID":"c4776223-9bc5-480f-acb7-d71c662bd3a8","Type":"ContainerStarted","Data":"4d2170a2549f7289e12696df3e75af242a8b6e6877c54a105783ee846094999c"} Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.103781 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59c78cff8f-gxv7p"] Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.136664 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57dc99974f-9r62r"] Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.138091 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.165518 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57dc99974f-9r62r"] Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.286037 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65b6n\" (UniqueName: \"kubernetes.io/projected/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-kube-api-access-65b6n\") pod \"dnsmasq-dns-57dc99974f-9r62r\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.286113 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-config\") pod \"dnsmasq-dns-57dc99974f-9r62r\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.286135 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-dns-svc\") pod \"dnsmasq-dns-57dc99974f-9r62r\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.387421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65b6n\" (UniqueName: \"kubernetes.io/projected/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-kube-api-access-65b6n\") pod \"dnsmasq-dns-57dc99974f-9r62r\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.387499 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-dns-svc\") pod \"dnsmasq-dns-57dc99974f-9r62r\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.387528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-config\") pod \"dnsmasq-dns-57dc99974f-9r62r\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.388461 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-dns-svc\") pod \"dnsmasq-dns-57dc99974f-9r62r\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.388892 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-config\") pod \"dnsmasq-dns-57dc99974f-9r62r\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.430803 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65b6n\" (UniqueName: \"kubernetes.io/projected/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-kube-api-access-65b6n\") pod \"dnsmasq-dns-57dc99974f-9r62r\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.461664 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.507927 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d46db5bb7-87gxz"] Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.531191 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b9746b6c-rmnft"] Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.532343 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.546421 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9746b6c-rmnft"] Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.691640 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-dns-svc\") pod \"dnsmasq-dns-7b9746b6c-rmnft\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.691733 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7zzg\" (UniqueName: \"kubernetes.io/projected/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-kube-api-access-r7zzg\") pod \"dnsmasq-dns-7b9746b6c-rmnft\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.691803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-config\") pod \"dnsmasq-dns-7b9746b6c-rmnft\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 
08:56:38.791761 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57dc99974f-9r62r"] Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.793228 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-dns-svc\") pod \"dnsmasq-dns-7b9746b6c-rmnft\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.793304 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7zzg\" (UniqueName: \"kubernetes.io/projected/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-kube-api-access-r7zzg\") pod \"dnsmasq-dns-7b9746b6c-rmnft\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.793351 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-config\") pod \"dnsmasq-dns-7b9746b6c-rmnft\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.794096 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-dns-svc\") pod \"dnsmasq-dns-7b9746b6c-rmnft\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.796408 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-config\") pod \"dnsmasq-dns-7b9746b6c-rmnft\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 
08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.814715 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86b4778477-cpjz6"] Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.816190 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.837344 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7zzg\" (UniqueName: \"kubernetes.io/projected/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-kube-api-access-r7zzg\") pod \"dnsmasq-dns-7b9746b6c-rmnft\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.852787 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.854526 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b4778477-cpjz6"] Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.896609 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqckf\" (UniqueName: \"kubernetes.io/projected/007c7c03-1417-496d-b35d-242dbaacc078-kube-api-access-gqckf\") pod \"dnsmasq-dns-86b4778477-cpjz6\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.896719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-config\") pod \"dnsmasq-dns-86b4778477-cpjz6\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.896914 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-dns-svc\") pod \"dnsmasq-dns-86b4778477-cpjz6\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.999116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-dns-svc\") pod \"dnsmasq-dns-86b4778477-cpjz6\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.999201 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqckf\" (UniqueName: \"kubernetes.io/projected/007c7c03-1417-496d-b35d-242dbaacc078-kube-api-access-gqckf\") pod \"dnsmasq-dns-86b4778477-cpjz6\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:38 crc kubenswrapper[4790]: I1003 08:56:38.999261 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-config\") pod \"dnsmasq-dns-86b4778477-cpjz6\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:38.999947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-dns-svc\") pod \"dnsmasq-dns-86b4778477-cpjz6\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.000458 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-config\") pod \"dnsmasq-dns-86b4778477-cpjz6\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.015774 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqckf\" (UniqueName: \"kubernetes.io/projected/007c7c03-1417-496d-b35d-242dbaacc078-kube-api-access-gqckf\") pod \"dnsmasq-dns-86b4778477-cpjz6\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.170064 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.359184 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.360751 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.363136 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2jckr" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.363457 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.363711 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.363856 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.364038 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.364292 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.369226 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.373959 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511510 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511534 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511563 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511636 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f79d57e8-680a-41b6-bca9-c233a695be4e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511653 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511742 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511814 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-config-data\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511853 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89kch\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-kube-api-access-89kch\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.511923 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f79d57e8-680a-41b6-bca9-c233a695be4e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.613743 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.613821 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.613852 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f79d57e8-680a-41b6-bca9-c233a695be4e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.613877 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.613900 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.613924 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-config-data\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 
08:56:39.613951 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89kch\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-kube-api-access-89kch\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.613989 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f79d57e8-680a-41b6-bca9-c233a695be4e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.614021 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.614043 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.614064 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.614560 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.615337 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.615604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.618179 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.618613 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-config-data\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.618813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " 
pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.620774 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.622562 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f79d57e8-680a-41b6-bca9-c233a695be4e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.634594 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.635184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f79d57e8-680a-41b6-bca9-c233a695be4e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.651040 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89kch\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-kube-api-access-89kch\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.666712 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.671246 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.672492 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.687112 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.687556 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-q8nqg" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.689262 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.689423 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.689549 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.689681 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.689777 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.689875 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.692054 4790 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.820673 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5ab6b2b-5bba-4043-b116-b8d996df1ace-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.820948 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.820970 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.821014 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.821034 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.821172 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5ab6b2b-5bba-4043-b116-b8d996df1ace-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.821238 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.821346 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq7zf\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-kube-api-access-dq7zf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.821436 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.821485 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.821512 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.923246 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.923296 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.923328 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.923346 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5ab6b2b-5bba-4043-b116-b8d996df1ace-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc 
kubenswrapper[4790]: I1003 08:56:39.923362 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.923380 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.923422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.923444 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.923467 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5ab6b2b-5bba-4043-b116-b8d996df1ace-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.923483 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.923518 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq7zf\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-kube-api-access-dq7zf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.924224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.924795 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.925524 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.926106 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.926978 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.928688 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5ab6b2b-5bba-4043-b116-b8d996df1ace-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.929419 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.944701 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.945356 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.949785 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5ab6b2b-5bba-4043-b116-b8d996df1ace-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.951317 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq7zf\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-kube-api-access-dq7zf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.956150 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"]
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.957799 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.963116 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-jfkjl"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.963387 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.963540 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.963757 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.963929 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.964050 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.964842 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.965627 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:39 crc kubenswrapper[4790]: I1003 08:56:39.970258 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"]
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.025180 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04926fe2-c5e2-4663-8945-448a17ec2a8b-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.025410 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/04926fe2-c5e2-4663-8945-448a17ec2a8b-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.025564 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.025680 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/04926fe2-c5e2-4663-8945-448a17ec2a8b-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.025766 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/04926fe2-c5e2-4663-8945-448a17ec2a8b-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.025872 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.025958 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/04926fe2-c5e2-4663-8945-448a17ec2a8b-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.026041 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.026121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24kmd\" (UniqueName: \"kubernetes.io/projected/04926fe2-c5e2-4663-8945-448a17ec2a8b-kube-api-access-24kmd\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.026200 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.026312 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.046261 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/04926fe2-c5e2-4663-8945-448a17ec2a8b-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127361 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/04926fe2-c5e2-4663-8945-448a17ec2a8b-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127405 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127429 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24kmd\" (UniqueName: \"kubernetes.io/projected/04926fe2-c5e2-4663-8945-448a17ec2a8b-kube-api-access-24kmd\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127450 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127506 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04926fe2-c5e2-4663-8945-448a17ec2a8b-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127921 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/04926fe2-c5e2-4663-8945-448a17ec2a8b-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127935 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.127953 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/04926fe2-c5e2-4663-8945-448a17ec2a8b-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.128473 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.128564 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.128741 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.128822 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/04926fe2-c5e2-4663-8945-448a17ec2a8b-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.129410 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/04926fe2-c5e2-4663-8945-448a17ec2a8b-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.129829 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/04926fe2-c5e2-4663-8945-448a17ec2a8b-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.131891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/04926fe2-c5e2-4663-8945-448a17ec2a8b-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.134214 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.136029 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/04926fe2-c5e2-4663-8945-448a17ec2a8b-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.137616 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/04926fe2-c5e2-4663-8945-448a17ec2a8b-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.144647 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24kmd\" (UniqueName: \"kubernetes.io/projected/04926fe2-c5e2-4663-8945-448a17ec2a8b-kube-api-access-24kmd\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.152302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"04926fe2-c5e2-4663-8945-448a17ec2a8b\") " pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:40 crc kubenswrapper[4790]: I1003 08:56:40.316054 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.113222 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.115676 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.120283 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.120365 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.120535 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.120866 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.121013 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-njg6l"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.121186 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.131657 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.183216 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.183598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.183621 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.183656 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.183679 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmqlr\" (UniqueName: \"kubernetes.io/projected/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-kube-api-access-kmqlr\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.183714 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.183848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.183883 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.183911 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.235326 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.236644 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.244376 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-kn7ns"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.244658 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.244807 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.246346 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.259075 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.284759 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/010b668f-22e4-4ad8-aae5-decf063cd49b-kolla-config\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.284829 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/010b668f-22e4-4ad8-aae5-decf063cd49b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.284857 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/010b668f-22e4-4ad8-aae5-decf063cd49b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.284889 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010b668f-22e4-4ad8-aae5-decf063cd49b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.284942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.284972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/010b668f-22e4-4ad8-aae5-decf063cd49b-secrets\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/010b668f-22e4-4ad8-aae5-decf063cd49b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285049 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/010b668f-22e4-4ad8-aae5-decf063cd49b-config-data-default\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285083 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285205 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285345 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285407 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285440 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9chh\" (UniqueName: \"kubernetes.io/projected/010b668f-22e4-4ad8-aae5-decf063cd49b-kube-api-access-k9chh\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285496 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmqlr\" (UniqueName: \"kubernetes.io/projected/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-kube-api-access-kmqlr\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285613 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285637 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.285972 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.286318 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.286560 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.287214 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.299693 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.300297 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.306319 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmqlr\" (UniqueName: \"kubernetes.io/projected/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-kube-api-access-kmqlr\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.324182 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/89aeb08e-2699-4c1b-99ef-4ae2cfce9de3-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.336296 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3\") " pod="openstack/openstack-cell1-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.387522 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9chh\" (UniqueName: \"kubernetes.io/projected/010b668f-22e4-4ad8-aae5-decf063cd49b-kube-api-access-k9chh\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.387768 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.387845 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/010b668f-22e4-4ad8-aae5-decf063cd49b-kolla-config\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.387887 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/010b668f-22e4-4ad8-aae5-decf063cd49b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.387933 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/010b668f-22e4-4ad8-aae5-decf063cd49b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.387965 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010b668f-22e4-4ad8-aae5-decf063cd49b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0"
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.388026 4790
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/010b668f-22e4-4ad8-aae5-decf063cd49b-secrets\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.388075 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.388924 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/010b668f-22e4-4ad8-aae5-decf063cd49b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.388089 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/010b668f-22e4-4ad8-aae5-decf063cd49b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.389016 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/010b668f-22e4-4ad8-aae5-decf063cd49b-config-data-default\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.389393 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/010b668f-22e4-4ad8-aae5-decf063cd49b-kolla-config\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.389869 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/010b668f-22e4-4ad8-aae5-decf063cd49b-config-data-default\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.391843 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/010b668f-22e4-4ad8-aae5-decf063cd49b-secrets\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.392481 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/010b668f-22e4-4ad8-aae5-decf063cd49b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.392535 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010b668f-22e4-4ad8-aae5-decf063cd49b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.394974 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/010b668f-22e4-4ad8-aae5-decf063cd49b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" 
Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.404279 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9chh\" (UniqueName: \"kubernetes.io/projected/010b668f-22e4-4ad8-aae5-decf063cd49b-kube-api-access-k9chh\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.407689 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"010b668f-22e4-4ad8-aae5-decf063cd49b\") " pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.442510 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.569668 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.646561 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.648106 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.651371 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.652590 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.652768 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-c99gv" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.660519 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.696265 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b885fd9-8ed3-45b9-8e6a-63069940ae79-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.696346 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb9qk\" (UniqueName: \"kubernetes.io/projected/3b885fd9-8ed3-45b9-8e6a-63069940ae79-kube-api-access-rb9qk\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.696393 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b885fd9-8ed3-45b9-8e6a-63069940ae79-config-data\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.696424 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b885fd9-8ed3-45b9-8e6a-63069940ae79-kolla-config\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.696437 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b885fd9-8ed3-45b9-8e6a-63069940ae79-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.797521 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb9qk\" (UniqueName: \"kubernetes.io/projected/3b885fd9-8ed3-45b9-8e6a-63069940ae79-kube-api-access-rb9qk\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.797603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b885fd9-8ed3-45b9-8e6a-63069940ae79-config-data\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.797640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b885fd9-8ed3-45b9-8e6a-63069940ae79-kolla-config\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.797658 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b885fd9-8ed3-45b9-8e6a-63069940ae79-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") 
" pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.797724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b885fd9-8ed3-45b9-8e6a-63069940ae79-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.798696 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b885fd9-8ed3-45b9-8e6a-63069940ae79-config-data\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.801628 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b885fd9-8ed3-45b9-8e6a-63069940ae79-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.801772 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b885fd9-8ed3-45b9-8e6a-63069940ae79-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.806890 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3b885fd9-8ed3-45b9-8e6a-63069940ae79-kolla-config\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.825795 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb9qk\" (UniqueName: 
\"kubernetes.io/projected/3b885fd9-8ed3-45b9-8e6a-63069940ae79-kube-api-access-rb9qk\") pod \"memcached-0\" (UID: \"3b885fd9-8ed3-45b9-8e6a-63069940ae79\") " pod="openstack/memcached-0" Oct 03 08:56:43 crc kubenswrapper[4790]: I1003 08:56:43.970360 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 03 08:56:45 crc kubenswrapper[4790]: I1003 08:56:45.759239 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:56:45 crc kubenswrapper[4790]: I1003 08:56:45.762368 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 08:56:45 crc kubenswrapper[4790]: I1003 08:56:45.764411 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8zpp5" Oct 03 08:56:45 crc kubenswrapper[4790]: I1003 08:56:45.774507 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:56:45 crc kubenswrapper[4790]: I1003 08:56:45.831884 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frf8p\" (UniqueName: \"kubernetes.io/projected/a93cbe46-9c4b-49b9-916f-b26ed8c80fe9-kube-api-access-frf8p\") pod \"kube-state-metrics-0\" (UID: \"a93cbe46-9c4b-49b9-916f-b26ed8c80fe9\") " pod="openstack/kube-state-metrics-0" Oct 03 08:56:45 crc kubenswrapper[4790]: I1003 08:56:45.934515 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frf8p\" (UniqueName: \"kubernetes.io/projected/a93cbe46-9c4b-49b9-916f-b26ed8c80fe9-kube-api-access-frf8p\") pod \"kube-state-metrics-0\" (UID: \"a93cbe46-9c4b-49b9-916f-b26ed8c80fe9\") " pod="openstack/kube-state-metrics-0" Oct 03 08:56:45 crc kubenswrapper[4790]: I1003 08:56:45.965319 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frf8p\" (UniqueName: 
\"kubernetes.io/projected/a93cbe46-9c4b-49b9-916f-b26ed8c80fe9-kube-api-access-frf8p\") pod \"kube-state-metrics-0\" (UID: \"a93cbe46-9c4b-49b9-916f-b26ed8c80fe9\") " pod="openstack/kube-state-metrics-0" Oct 03 08:56:46 crc kubenswrapper[4790]: I1003 08:56:46.084635 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.081376 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.084146 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.088196 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.088340 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xvhkt" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.088470 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.090648 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.090684 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.095595 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.098555 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 
03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.154915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dc14abb-14bb-4bee-9864-8507e37d824d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.154972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-config\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.154999 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.155096 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dc14abb-14bb-4bee-9864-8507e37d824d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.155129 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod 
\"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.155160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.155235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.155264 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78pfp\" (UniqueName: \"kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-kube-api-access-78pfp\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.256472 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.256532 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78pfp\" (UniqueName: \"kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-kube-api-access-78pfp\") pod 
\"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.256586 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dc14abb-14bb-4bee-9864-8507e37d824d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.256617 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-config\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.256643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.256720 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dc14abb-14bb-4bee-9864-8507e37d824d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.256783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.256817 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.260918 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dc14abb-14bb-4bee-9864-8507e37d824d-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.261702 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.262512 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-config\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.263315 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-web-config\") 
pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.264267 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dc14abb-14bb-4bee-9864-8507e37d824d-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.264817 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.264840 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a9f968d81b8706f372416bcf6becb6316da12568f5c5e3584a75f6147541fb3e/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.266194 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.280865 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78pfp\" (UniqueName: \"kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-kube-api-access-78pfp\") pod 
\"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.340806 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:47 crc kubenswrapper[4790]: I1003 08:56:47.419399 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.240499 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8ljvm"] Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.242953 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.245860 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-kp8rn" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.256374 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.259124 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-vrsc7"] Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.261344 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.263383 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.295858 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/845dcf0a-43fb-4760-aac6-d12c02a8b785-var-log-ovn\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.295913 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-etc-ovs\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.295940 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs7tn\" (UniqueName: \"kubernetes.io/projected/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-kube-api-access-qs7tn\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.296119 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7hrg\" (UniqueName: \"kubernetes.io/projected/845dcf0a-43fb-4760-aac6-d12c02a8b785-kube-api-access-d7hrg\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.296256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/845dcf0a-43fb-4760-aac6-d12c02a8b785-ovn-controller-tls-certs\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.296325 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/845dcf0a-43fb-4760-aac6-d12c02a8b785-scripts\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.296352 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/845dcf0a-43fb-4760-aac6-d12c02a8b785-var-run-ovn\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.296437 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-var-run\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.296542 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-scripts\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.296605 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/845dcf0a-43fb-4760-aac6-d12c02a8b785-combined-ca-bundle\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.296631 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-var-log\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.296681 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-var-lib\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.296779 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/845dcf0a-43fb-4760-aac6-d12c02a8b785-var-run\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.315341 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vrsc7"] Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.330484 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8ljvm"] Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7hrg\" (UniqueName: \"kubernetes.io/projected/845dcf0a-43fb-4760-aac6-d12c02a8b785-kube-api-access-d7hrg\") pod \"ovn-controller-8ljvm\" (UID: 
\"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/845dcf0a-43fb-4760-aac6-d12c02a8b785-ovn-controller-tls-certs\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398543 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/845dcf0a-43fb-4760-aac6-d12c02a8b785-scripts\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398584 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/845dcf0a-43fb-4760-aac6-d12c02a8b785-var-run-ovn\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398604 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-var-run\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-scripts\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398660 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845dcf0a-43fb-4760-aac6-d12c02a8b785-combined-ca-bundle\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-var-log\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398734 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-var-lib\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398800 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/845dcf0a-43fb-4760-aac6-d12c02a8b785-var-run\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398839 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/845dcf0a-43fb-4760-aac6-d12c02a8b785-var-log-ovn\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-etc-ovs\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.398906 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs7tn\" (UniqueName: \"kubernetes.io/projected/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-kube-api-access-qs7tn\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.400883 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-var-log\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.401055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-var-run\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.401360 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-var-lib\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.402353 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/845dcf0a-43fb-4760-aac6-d12c02a8b785-var-run\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " 
pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.402520 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/845dcf0a-43fb-4760-aac6-d12c02a8b785-var-log-ovn\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.402697 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-etc-ovs\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.402795 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/845dcf0a-43fb-4760-aac6-d12c02a8b785-var-run-ovn\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.404292 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/845dcf0a-43fb-4760-aac6-d12c02a8b785-scripts\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.409311 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/845dcf0a-43fb-4760-aac6-d12c02a8b785-ovn-controller-tls-certs\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.417281 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-scripts\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.420118 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/845dcf0a-43fb-4760-aac6-d12c02a8b785-combined-ca-bundle\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.427186 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs7tn\" (UniqueName: \"kubernetes.io/projected/5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec-kube-api-access-qs7tn\") pod \"ovn-controller-ovs-vrsc7\" (UID: \"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec\") " pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.438378 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7hrg\" (UniqueName: \"kubernetes.io/projected/845dcf0a-43fb-4760-aac6-d12c02a8b785-kube-api-access-d7hrg\") pod \"ovn-controller-8ljvm\" (UID: \"845dcf0a-43fb-4760-aac6-d12c02a8b785\") " pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.561111 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8ljvm" Oct 03 08:56:49 crc kubenswrapper[4790]: I1003 08:56:49.578409 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.084054 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.085758 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.088046 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.088203 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.094055 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dxqdb" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.094321 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.094371 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.110184 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.110248 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c53865bf-810d-440a-bfa4-171e3183066c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.110281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c53865bf-810d-440a-bfa4-171e3183066c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" 
(UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.110318 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53865bf-810d-440a-bfa4-171e3183066c-config\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.110346 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53865bf-810d-440a-bfa4-171e3183066c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.110410 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6nk8\" (UniqueName: \"kubernetes.io/projected/c53865bf-810d-440a-bfa4-171e3183066c-kube-api-access-b6nk8\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.110444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c53865bf-810d-440a-bfa4-171e3183066c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.110547 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53865bf-810d-440a-bfa4-171e3183066c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 
08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.117708 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.212329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53865bf-810d-440a-bfa4-171e3183066c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.212404 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.212437 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c53865bf-810d-440a-bfa4-171e3183066c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.212465 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c53865bf-810d-440a-bfa4-171e3183066c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.212492 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53865bf-810d-440a-bfa4-171e3183066c-config\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 
08:56:50.212516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53865bf-810d-440a-bfa4-171e3183066c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.212599 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6nk8\" (UniqueName: \"kubernetes.io/projected/c53865bf-810d-440a-bfa4-171e3183066c-kube-api-access-b6nk8\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.212634 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c53865bf-810d-440a-bfa4-171e3183066c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.213128 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c53865bf-810d-440a-bfa4-171e3183066c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.213242 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.213723 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c53865bf-810d-440a-bfa4-171e3183066c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.214068 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c53865bf-810d-440a-bfa4-171e3183066c-config\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.219452 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53865bf-810d-440a-bfa4-171e3183066c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.221756 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c53865bf-810d-440a-bfa4-171e3183066c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.230463 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6nk8\" (UniqueName: \"kubernetes.io/projected/c53865bf-810d-440a-bfa4-171e3183066c-kube-api-access-b6nk8\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.252549 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc 
kubenswrapper[4790]: I1003 08:56:50.254676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c53865bf-810d-440a-bfa4-171e3183066c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"c53865bf-810d-440a-bfa4-171e3183066c\") " pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:50 crc kubenswrapper[4790]: I1003 08:56:50.408411 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 03 08:56:52 crc kubenswrapper[4790]: E1003 08:56:52.397011 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 03 08:56:52 crc kubenswrapper[4790]: E1003 08:56:52.397339 4790 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 03 08:56:52 crc kubenswrapper[4790]: E1003 08:56:52.397496 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.58:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gxxrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-59c78cff8f-gxv7p_openstack(7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:56:52 crc kubenswrapper[4790]: E1003 08:56:52.399642 4790 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" podUID="7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b" Oct 03 08:56:52 crc kubenswrapper[4790]: E1003 08:56:52.405430 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 03 08:56:52 crc kubenswrapper[4790]: E1003 08:56:52.405476 4790 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 03 08:56:52 crc kubenswrapper[4790]: E1003 08:56:52.405625 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.58:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hmwdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d46db5bb7-87gxz_openstack(c4776223-9bc5-480f-acb7-d71c662bd3a8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:56:52 crc kubenswrapper[4790]: E1003 08:56:52.407051 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" podUID="c4776223-9bc5-480f-acb7-d71c662bd3a8" Oct 03 08:56:52 crc kubenswrapper[4790]: I1003 08:56:52.900706 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.280185 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b4778477-cpjz6"] Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.286714 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57dc99974f-9r62r"] Oct 03 08:56:53 crc kubenswrapper[4790]: W1003 08:56:53.290469 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod007c7c03_1417_496d_b35d_242dbaacc078.slice/crio-2edc6a756a768e20c71cf171b5688e661aaaebc7d72841def190a28be4b805aa WatchSource:0}: Error finding container 2edc6a756a768e20c71cf171b5688e661aaaebc7d72841def190a28be4b805aa: Status 404 returned error can't find the container with id 2edc6a756a768e20c71cf171b5688e661aaaebc7d72841def190a28be4b805aa Oct 03 08:56:53 crc kubenswrapper[4790]: W1003 08:56:53.291679 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d262119_1fe3_42b9_bdc1_fe03ce687c2e.slice/crio-fb872bff84c1fe003745acc1857df562012325f7d6d0d5b62971acff1b1eef24 WatchSource:0}: Error finding container fb872bff84c1fe003745acc1857df562012325f7d6d0d5b62971acff1b1eef24: Status 404 returned error can't find the container with id fb872bff84c1fe003745acc1857df562012325f7d6d0d5b62971acff1b1eef24 Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.307022 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.468680 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmwdq\" (UniqueName: \"kubernetes.io/projected/c4776223-9bc5-480f-acb7-d71c662bd3a8-kube-api-access-hmwdq\") pod \"c4776223-9bc5-480f-acb7-d71c662bd3a8\" (UID: \"c4776223-9bc5-480f-acb7-d71c662bd3a8\") " Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.469083 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4776223-9bc5-480f-acb7-d71c662bd3a8-config\") pod \"c4776223-9bc5-480f-acb7-d71c662bd3a8\" (UID: \"c4776223-9bc5-480f-acb7-d71c662bd3a8\") " Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.469987 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4776223-9bc5-480f-acb7-d71c662bd3a8-config" (OuterVolumeSpecName: "config") pod "c4776223-9bc5-480f-acb7-d71c662bd3a8" (UID: "c4776223-9bc5-480f-acb7-d71c662bd3a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.478025 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4776223-9bc5-480f-acb7-d71c662bd3a8-kube-api-access-hmwdq" (OuterVolumeSpecName: "kube-api-access-hmwdq") pod "c4776223-9bc5-480f-acb7-d71c662bd3a8" (UID: "c4776223-9bc5-480f-acb7-d71c662bd3a8"). InnerVolumeSpecName "kube-api-access-hmwdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.572658 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmwdq\" (UniqueName: \"kubernetes.io/projected/c4776223-9bc5-480f-acb7-d71c662bd3a8-kube-api-access-hmwdq\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.572695 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4776223-9bc5-480f-acb7-d71c662bd3a8-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.630997 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.635120 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.638294 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.638625 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.639775 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.639822 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9xff8" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.654219 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.776855 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.776972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.777307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.777343 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.777504 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.777625 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c4mz\" (UniqueName: 
\"kubernetes.io/projected/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-kube-api-access-8c4mz\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.777705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-config\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.778225 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.904141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.904490 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.904531 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " 
pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.904612 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.904637 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.904661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.905393 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c4mz\" (UniqueName: \"kubernetes.io/projected/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-kube-api-access-8c4mz\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.905423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-config\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.907334 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-config\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.907615 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.911754 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.911970 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.920228 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.925466 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " 
pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.926431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.985954 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c4mz\" (UniqueName: \"kubernetes.io/projected/6330b59a-ef23-4482-9fa5-58ccb2f6bc5a-kube-api-access-8c4mz\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:53 crc kubenswrapper[4790]: I1003 08:56:53.997341 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8ljvm"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:53.998078 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f5ab6b2b-5bba-4043-b116-b8d996df1ace","Type":"ContainerStarted","Data":"e4f27495358a00205ee47f385968638e41b2517d471d590ba99cc7a1f150bfd6"} Oct 03 08:56:54 crc kubenswrapper[4790]: W1003 08:56:54.015754 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc14abb_14bb_4bee_9864_8507e37d824d.slice/crio-d348ec39b29be4bfa56af1730b69c2d7e9b6cfda886f84630fbe86bf5dce8091 WatchSource:0}: Error finding container d348ec39b29be4bfa56af1730b69c2d7e9b6cfda886f84630fbe86bf5dce8091: Status 404 returned error can't find the container with id d348ec39b29be4bfa56af1730b69c2d7e9b6cfda886f84630fbe86bf5dce8091 Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.020302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a\") " pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.023199 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" event={"ID":"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b","Type":"ContainerDied","Data":"24d354d8f83d55e63cab2b1e36adaa11a290f9cfe4cf5dd8ff734068e84e4dc8"} Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.023240 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24d354d8f83d55e63cab2b1e36adaa11a290f9cfe4cf5dd8ff734068e84e4dc8" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.024034 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.042365 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.043836 4790 generic.go:334] "Generic (PLEG): container finished" podID="5d262119-1fe3-42b9-bdc1-fe03ce687c2e" containerID="6ac1685a22e2aacabeeb8835254010656df644903df90bfb40bf374cfd2ba724" exitCode=0 Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.044041 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dc99974f-9r62r" event={"ID":"5d262119-1fe3-42b9-bdc1-fe03ce687c2e","Type":"ContainerDied","Data":"6ac1685a22e2aacabeeb8835254010656df644903df90bfb40bf374cfd2ba724"} Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.044159 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dc99974f-9r62r" event={"ID":"5d262119-1fe3-42b9-bdc1-fe03ce687c2e","Type":"ContainerStarted","Data":"fb872bff84c1fe003745acc1857df562012325f7d6d0d5b62971acff1b1eef24"} Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.049950 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" event={"ID":"c4776223-9bc5-480f-acb7-d71c662bd3a8","Type":"ContainerDied","Data":"4d2170a2549f7289e12696df3e75af242a8b6e6877c54a105783ee846094999c"} Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.050052 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d46db5bb7-87gxz" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.053775 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9746b6c-rmnft"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.059610 4790 generic.go:334] "Generic (PLEG): container finished" podID="007c7c03-1417-496d-b35d-242dbaacc078" containerID="6def2da72aa16292a45d28a978a2d098fc01aa57a2572615781488adb41bd6de" exitCode=0 Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.059759 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" event={"ID":"007c7c03-1417-496d-b35d-242dbaacc078","Type":"ContainerDied","Data":"6def2da72aa16292a45d28a978a2d098fc01aa57a2572615781488adb41bd6de"} Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.060468 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" event={"ID":"007c7c03-1417-496d-b35d-242dbaacc078","Type":"ContainerStarted","Data":"2edc6a756a768e20c71cf171b5688e661aaaebc7d72841def190a28be4b805aa"} Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.060918 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.084369 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.099939 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 03 08:56:54 crc 
kubenswrapper[4790]: I1003 08:56:54.115557 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.128239 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.139230 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.161795 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.213827 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d46db5bb7-87gxz"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.218600 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-dns-svc\") pod \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.218747 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-config\") pod \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.218788 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxxrw\" (UniqueName: \"kubernetes.io/projected/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-kube-api-access-gxxrw\") pod \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\" (UID: \"7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b\") " Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.219229 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b" (UID: "7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.219702 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-config" (OuterVolumeSpecName: "config") pod "7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b" (UID: "7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.225891 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-kube-api-access-gxxrw" (OuterVolumeSpecName: "kube-api-access-gxxrw") pod "7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b" (UID: "7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b"). InnerVolumeSpecName "kube-api-access-gxxrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.230395 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d46db5bb7-87gxz"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.235175 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vrsc7"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.272824 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.320805 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.320856 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.320868 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxxrw\" (UniqueName: \"kubernetes.io/projected/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b-kube-api-access-gxxrw\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.369837 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4776223-9bc5-480f-acb7-d71c662bd3a8" path="/var/lib/kubelet/pods/c4776223-9bc5-480f-acb7-d71c662bd3a8/volumes" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.655668 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.769835 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6vcdh"] Oct 03 08:56:54 crc kubenswrapper[4790]: E1003 08:56:54.771454 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d262119-1fe3-42b9-bdc1-fe03ce687c2e" containerName="init" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.771476 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d262119-1fe3-42b9-bdc1-fe03ce687c2e" containerName="init" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.771950 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d262119-1fe3-42b9-bdc1-fe03ce687c2e" containerName="init" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.791898 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.796874 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.799786 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6vcdh"] Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.848705 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-config\") pod \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.849737 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-dns-svc\") pod \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " 
Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.849816 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65b6n\" (UniqueName: \"kubernetes.io/projected/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-kube-api-access-65b6n\") pod \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\" (UID: \"5d262119-1fe3-42b9-bdc1-fe03ce687c2e\") " Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.850121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f892381b-8fc9-4352-915e-62d0b92cb7b6-config\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.850193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82v5g\" (UniqueName: \"kubernetes.io/projected/f892381b-8fc9-4352-915e-62d0b92cb7b6-kube-api-access-82v5g\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.850231 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f892381b-8fc9-4352-915e-62d0b92cb7b6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.850266 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f892381b-8fc9-4352-915e-62d0b92cb7b6-ovn-rundir\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " 
pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.850314 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f892381b-8fc9-4352-915e-62d0b92cb7b6-ovs-rundir\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.850351 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f892381b-8fc9-4352-915e-62d0b92cb7b6-combined-ca-bundle\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.913812 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d262119-1fe3-42b9-bdc1-fe03ce687c2e" (UID: "5d262119-1fe3-42b9-bdc1-fe03ce687c2e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.915187 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-config" (OuterVolumeSpecName: "config") pod "5d262119-1fe3-42b9-bdc1-fe03ce687c2e" (UID: "5d262119-1fe3-42b9-bdc1-fe03ce687c2e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.919739 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-kube-api-access-65b6n" (OuterVolumeSpecName: "kube-api-access-65b6n") pod "5d262119-1fe3-42b9-bdc1-fe03ce687c2e" (UID: "5d262119-1fe3-42b9-bdc1-fe03ce687c2e"). InnerVolumeSpecName "kube-api-access-65b6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.961418 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82v5g\" (UniqueName: \"kubernetes.io/projected/f892381b-8fc9-4352-915e-62d0b92cb7b6-kube-api-access-82v5g\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.961466 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f892381b-8fc9-4352-915e-62d0b92cb7b6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.961492 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f892381b-8fc9-4352-915e-62d0b92cb7b6-ovn-rundir\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.961507 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f892381b-8fc9-4352-915e-62d0b92cb7b6-ovs-rundir\") pod \"ovn-controller-metrics-6vcdh\" (UID: 
\"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.961530 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f892381b-8fc9-4352-915e-62d0b92cb7b6-combined-ca-bundle\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.961593 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f892381b-8fc9-4352-915e-62d0b92cb7b6-config\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.961647 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.961658 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65b6n\" (UniqueName: \"kubernetes.io/projected/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-kube-api-access-65b6n\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.961681 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d262119-1fe3-42b9-bdc1-fe03ce687c2e-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.962192 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f892381b-8fc9-4352-915e-62d0b92cb7b6-config\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " 
pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.963428 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f892381b-8fc9-4352-915e-62d0b92cb7b6-ovs-rundir\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.963480 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f892381b-8fc9-4352-915e-62d0b92cb7b6-ovn-rundir\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.971294 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f892381b-8fc9-4352-915e-62d0b92cb7b6-combined-ca-bundle\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:54 crc kubenswrapper[4790]: I1003 08:56:54.982239 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f892381b-8fc9-4352-915e-62d0b92cb7b6-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.007899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82v5g\" (UniqueName: \"kubernetes.io/projected/f892381b-8fc9-4352-915e-62d0b92cb7b6-kube-api-access-82v5g\") pod \"ovn-controller-metrics-6vcdh\" (UID: \"f892381b-8fc9-4352-915e-62d0b92cb7b6\") " pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:55 crc 
kubenswrapper[4790]: I1003 08:56:55.066665 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9746b6c-rmnft"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.125998 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3","Type":"ContainerStarted","Data":"abdf3e52c4372b9ad257a4f02e171aa04c969b8d6b3d7889d3fbb030bcbf5294"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.133152 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8ljvm" event={"ID":"845dcf0a-43fb-4760-aac6-d12c02a8b785","Type":"ContainerStarted","Data":"73998a7851ff685943dfe73d2b89a6fef658fb694aba6a03b9034ec8a57d0a3f"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.135759 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-788c899457-p84kc"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.142970 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.148919 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6vcdh" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.153240 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.160943 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-788c899457-p84kc"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.168725 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-config\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.168801 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72nw\" (UniqueName: \"kubernetes.io/projected/18264edd-1eee-4089-ad97-63c7092a8650-kube-api-access-w72nw\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.168829 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-ovsdbserver-sb\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.168875 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-dns-svc\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " 
pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.178449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c53865bf-810d-440a-bfa4-171e3183066c","Type":"ContainerStarted","Data":"1fde777d7078d74c20ecebbf55f957eae438c5cae2670f6711bac329324ac02f"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.195681 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3b885fd9-8ed3-45b9-8e6a-63069940ae79","Type":"ContainerStarted","Data":"ffacb8f354dfe308aeb46965d427da0c334254a45fe1ef7dc544c59102288b5b"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.210619 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3dc14abb-14bb-4bee-9864-8507e37d824d","Type":"ContainerStarted","Data":"d348ec39b29be4bfa56af1730b69c2d7e9b6cfda886f84630fbe86bf5dce8091"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.215077 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"010b668f-22e4-4ad8-aae5-decf063cd49b","Type":"ContainerStarted","Data":"cf49f028d069f3e8f50a5c963473172600dcb177dfb234e822e81c7c2b915503"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.218154 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57dc99974f-9r62r" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.218495 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57dc99974f-9r62r" event={"ID":"5d262119-1fe3-42b9-bdc1-fe03ce687c2e","Type":"ContainerDied","Data":"fb872bff84c1fe003745acc1857df562012325f7d6d0d5b62971acff1b1eef24"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.218557 4790 scope.go:117] "RemoveContainer" containerID="6ac1685a22e2aacabeeb8835254010656df644903df90bfb40bf374cfd2ba724" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.225111 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.234709 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrsc7" event={"ID":"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec","Type":"ContainerStarted","Data":"a8034b39c7fe53623e645bcf1dc51307101d57b688ffae21a2908fe708bce29e"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.240274 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"04926fe2-c5e2-4663-8945-448a17ec2a8b","Type":"ContainerStarted","Data":"3e5f3e9a7cd60c683a4289a5096e0a431ad4781be342c590c8e3571448757ff3"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.243523 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f79d57e8-680a-41b6-bca9-c233a695be4e","Type":"ContainerStarted","Data":"b222b3d339eff75c384074dc30ffa77e7683d9a7296948d7303c51f3a939e5f7"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.247698 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a93cbe46-9c4b-49b9-916f-b26ed8c80fe9","Type":"ContainerStarted","Data":"6bf9ae8c2995bec5c9ed445d0c82a475195feee6453695ddc6d967018851ec9b"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 
08:56:55.251559 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" event={"ID":"007c7c03-1417-496d-b35d-242dbaacc078","Type":"ContainerStarted","Data":"b98bfe339a6912c522d44af2ec696bcec0c67a89c5e7486e0ed7b3c99d3eceb9"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.252042 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.257152 4790 generic.go:334] "Generic (PLEG): container finished" podID="9269d241-5eae-4eb1-b5a9-c4ac47ce177e" containerID="4227c40deecaa8a969cfdfaa0a9738eb174bad9141bd35d8295af11e64a3c6d2" exitCode=0 Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.257229 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59c78cff8f-gxv7p" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.258015 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" event={"ID":"9269d241-5eae-4eb1-b5a9-c4ac47ce177e","Type":"ContainerDied","Data":"4227c40deecaa8a969cfdfaa0a9738eb174bad9141bd35d8295af11e64a3c6d2"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.258118 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" event={"ID":"9269d241-5eae-4eb1-b5a9-c4ac47ce177e","Type":"ContainerStarted","Data":"0aaf0d8e5dbae4a3d88f9853c6f4c735bcd837f1857cc0e228c107d09cebedb6"} Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.270073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-config\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.271547 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w72nw\" (UniqueName: \"kubernetes.io/projected/18264edd-1eee-4089-ad97-63c7092a8650-kube-api-access-w72nw\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.271642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-ovsdbserver-sb\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.271849 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-dns-svc\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.271972 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-config\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.273344 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-dns-svc\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.274362 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-ovsdbserver-sb\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.286745 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57dc99974f-9r62r"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.301083 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57dc99974f-9r62r"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.308414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72nw\" (UniqueName: \"kubernetes.io/projected/18264edd-1eee-4089-ad97-63c7092a8650-kube-api-access-w72nw\") pod \"dnsmasq-dns-788c899457-p84kc\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.343175 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59c78cff8f-gxv7p"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.355919 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59c78cff8f-gxv7p"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.370080 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" podStartSLOduration=17.290692824 podStartE2EDuration="17.370053636s" podCreationTimestamp="2025-10-03 08:56:38 +0000 UTC" firstStartedPulling="2025-10-03 08:56:53.293487205 +0000 UTC m=+999.646421443" lastFinishedPulling="2025-10-03 08:56:53.372848017 +0000 UTC m=+999.725782255" observedRunningTime="2025-10-03 08:56:55.341756792 +0000 UTC m=+1001.694691030" watchObservedRunningTime="2025-10-03 08:56:55.370053636 +0000 UTC m=+1001.722987874" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.443402 4790 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-86b4778477-cpjz6"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.474979 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bfbdb9bd9-46r9c"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.476636 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.482272 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.483938 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfbdb9bd9-46r9c"] Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.490232 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.576348 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-config\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.576394 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-nb\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.576438 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hwkw\" (UniqueName: 
\"kubernetes.io/projected/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-kube-api-access-7hwkw\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.576466 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-dns-svc\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.576483 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-sb\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: W1003 08:56:55.648218 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6330b59a_ef23_4482_9fa5_58ccb2f6bc5a.slice/crio-085751a1f5a1ad2fc45c1ad73f667992b8dcb4574762d06c3ca6a6a925bd7fea WatchSource:0}: Error finding container 085751a1f5a1ad2fc45c1ad73f667992b8dcb4574762d06c3ca6a6a925bd7fea: Status 404 returned error can't find the container with id 085751a1f5a1ad2fc45c1ad73f667992b8dcb4574762d06c3ca6a6a925bd7fea Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.679033 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-config\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.679212 
4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-nb\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.679269 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hwkw\" (UniqueName: \"kubernetes.io/projected/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-kube-api-access-7hwkw\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.679303 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-dns-svc\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.679324 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-sb\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.681081 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-sb\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.681566 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-config\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.681799 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-dns-svc\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.682705 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-nb\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.697176 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hwkw\" (UniqueName: \"kubernetes.io/projected/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-kube-api-access-7hwkw\") pod \"dnsmasq-dns-7bfbdb9bd9-46r9c\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:55 crc kubenswrapper[4790]: I1003 08:56:55.836963 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:56:56 crc kubenswrapper[4790]: I1003 08:56:56.277215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a","Type":"ContainerStarted","Data":"085751a1f5a1ad2fc45c1ad73f667992b8dcb4574762d06c3ca6a6a925bd7fea"} Oct 03 08:56:56 crc kubenswrapper[4790]: I1003 08:56:56.342120 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d262119-1fe3-42b9-bdc1-fe03ce687c2e" path="/var/lib/kubelet/pods/5d262119-1fe3-42b9-bdc1-fe03ce687c2e/volumes" Oct 03 08:56:56 crc kubenswrapper[4790]: I1003 08:56:56.342882 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b" path="/var/lib/kubelet/pods/7c60d6d1-6e32-4c60-a7c8-7bfb48d7267b/volumes" Oct 03 08:56:57 crc kubenswrapper[4790]: I1003 08:56:57.168712 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-788c899457-p84kc"] Oct 03 08:56:57 crc kubenswrapper[4790]: I1003 08:56:57.296073 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" podUID="007c7c03-1417-496d-b35d-242dbaacc078" containerName="dnsmasq-dns" containerID="cri-o://b98bfe339a6912c522d44af2ec696bcec0c67a89c5e7486e0ed7b3c99d3eceb9" gracePeriod=10 Oct 03 08:56:58 crc kubenswrapper[4790]: I1003 08:56:58.304954 4790 generic.go:334] "Generic (PLEG): container finished" podID="007c7c03-1417-496d-b35d-242dbaacc078" containerID="b98bfe339a6912c522d44af2ec696bcec0c67a89c5e7486e0ed7b3c99d3eceb9" exitCode=0 Oct 03 08:56:58 crc kubenswrapper[4790]: I1003 08:56:58.305030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" event={"ID":"007c7c03-1417-496d-b35d-242dbaacc078","Type":"ContainerDied","Data":"b98bfe339a6912c522d44af2ec696bcec0c67a89c5e7486e0ed7b3c99d3eceb9"} Oct 03 08:56:59 crc 
kubenswrapper[4790]: W1003 08:56:58.999765 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18264edd_1eee_4089_ad97_63c7092a8650.slice/crio-aa04d87a1f8fcbb769584d4702b66ef9d33b99a5aca3815032dbafa8911569d2 WatchSource:0}: Error finding container aa04d87a1f8fcbb769584d4702b66ef9d33b99a5aca3815032dbafa8911569d2: Status 404 returned error can't find the container with id aa04d87a1f8fcbb769584d4702b66ef9d33b99a5aca3815032dbafa8911569d2 Oct 03 08:56:59 crc kubenswrapper[4790]: I1003 08:56:59.311462 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788c899457-p84kc" event={"ID":"18264edd-1eee-4089-ad97-63c7092a8650","Type":"ContainerStarted","Data":"aa04d87a1f8fcbb769584d4702b66ef9d33b99a5aca3815032dbafa8911569d2"} Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.531026 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.693834 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-dns-svc\") pod \"007c7c03-1417-496d-b35d-242dbaacc078\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.693882 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqckf\" (UniqueName: \"kubernetes.io/projected/007c7c03-1417-496d-b35d-242dbaacc078-kube-api-access-gqckf\") pod \"007c7c03-1417-496d-b35d-242dbaacc078\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.693911 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-config\") pod 
\"007c7c03-1417-496d-b35d-242dbaacc078\" (UID: \"007c7c03-1417-496d-b35d-242dbaacc078\") " Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.698856 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007c7c03-1417-496d-b35d-242dbaacc078-kube-api-access-gqckf" (OuterVolumeSpecName: "kube-api-access-gqckf") pod "007c7c03-1417-496d-b35d-242dbaacc078" (UID: "007c7c03-1417-496d-b35d-242dbaacc078"). InnerVolumeSpecName "kube-api-access-gqckf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.729727 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-config" (OuterVolumeSpecName: "config") pod "007c7c03-1417-496d-b35d-242dbaacc078" (UID: "007c7c03-1417-496d-b35d-242dbaacc078"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.732780 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "007c7c03-1417-496d-b35d-242dbaacc078" (UID: "007c7c03-1417-496d-b35d-242dbaacc078"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.795475 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.795512 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqckf\" (UniqueName: \"kubernetes.io/projected/007c7c03-1417-496d-b35d-242dbaacc078-kube-api-access-gqckf\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.795523 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/007c7c03-1417-496d-b35d-242dbaacc078-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:01 crc kubenswrapper[4790]: I1003 08:57:01.867208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6vcdh"] Oct 03 08:57:02 crc kubenswrapper[4790]: W1003 08:57:02.204755 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf892381b_8fc9_4352_915e_62d0b92cb7b6.slice/crio-e610d7de2bf1ba03a0ffba22ee74409645a76d9d20e3b650cc1146f7d8a0af55 WatchSource:0}: Error finding container e610d7de2bf1ba03a0ffba22ee74409645a76d9d20e3b650cc1146f7d8a0af55: Status 404 returned error can't find the container with id e610d7de2bf1ba03a0ffba22ee74409645a76d9d20e3b650cc1146f7d8a0af55 Oct 03 08:57:02 crc kubenswrapper[4790]: I1003 08:57:02.334688 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" Oct 03 08:57:02 crc kubenswrapper[4790]: I1003 08:57:02.338687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6vcdh" event={"ID":"f892381b-8fc9-4352-915e-62d0b92cb7b6","Type":"ContainerStarted","Data":"e610d7de2bf1ba03a0ffba22ee74409645a76d9d20e3b650cc1146f7d8a0af55"} Oct 03 08:57:02 crc kubenswrapper[4790]: I1003 08:57:02.338731 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" event={"ID":"007c7c03-1417-496d-b35d-242dbaacc078","Type":"ContainerDied","Data":"2edc6a756a768e20c71cf171b5688e661aaaebc7d72841def190a28be4b805aa"} Oct 03 08:57:02 crc kubenswrapper[4790]: I1003 08:57:02.338758 4790 scope.go:117] "RemoveContainer" containerID="b98bfe339a6912c522d44af2ec696bcec0c67a89c5e7486e0ed7b3c99d3eceb9" Oct 03 08:57:02 crc kubenswrapper[4790]: I1003 08:57:02.374285 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b4778477-cpjz6"] Oct 03 08:57:02 crc kubenswrapper[4790]: I1003 08:57:02.380159 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86b4778477-cpjz6"] Oct 03 08:57:02 crc kubenswrapper[4790]: I1003 08:57:02.525810 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfbdb9bd9-46r9c"] Oct 03 08:57:02 crc kubenswrapper[4790]: I1003 08:57:02.572862 4790 scope.go:117] "RemoveContainer" containerID="6def2da72aa16292a45d28a978a2d098fc01aa57a2572615781488adb41bd6de" Oct 03 08:57:02 crc kubenswrapper[4790]: W1003 08:57:02.578270 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb8c0c5_d92e_462c_88ea_5bbfc5b06d90.slice/crio-f6996b1cdf8dbb5992e59898124f786bad7d76f3282b2d2a47e309254a46e8cc WatchSource:0}: Error finding container f6996b1cdf8dbb5992e59898124f786bad7d76f3282b2d2a47e309254a46e8cc: Status 404 returned error can't 
find the container with id f6996b1cdf8dbb5992e59898124f786bad7d76f3282b2d2a47e309254a46e8cc Oct 03 08:57:03 crc kubenswrapper[4790]: I1003 08:57:03.344527 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" event={"ID":"9269d241-5eae-4eb1-b5a9-c4ac47ce177e","Type":"ContainerStarted","Data":"54852890d6da353794bab5dfcfcf9a00616e67989097493ae01945c983265da4"} Oct 03 08:57:03 crc kubenswrapper[4790]: I1003 08:57:03.344709 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" podUID="9269d241-5eae-4eb1-b5a9-c4ac47ce177e" containerName="dnsmasq-dns" containerID="cri-o://54852890d6da353794bab5dfcfcf9a00616e67989097493ae01945c983265da4" gracePeriod=10 Oct 03 08:57:03 crc kubenswrapper[4790]: I1003 08:57:03.348265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" event={"ID":"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90","Type":"ContainerStarted","Data":"f6996b1cdf8dbb5992e59898124f786bad7d76f3282b2d2a47e309254a46e8cc"} Oct 03 08:57:03 crc kubenswrapper[4790]: I1003 08:57:03.352999 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788c899457-p84kc" event={"ID":"18264edd-1eee-4089-ad97-63c7092a8650","Type":"ContainerStarted","Data":"d8ab163703c8c49478be3bc9dfe1a0f92c00e90e8156536e958921c089f5c294"} Oct 03 08:57:03 crc kubenswrapper[4790]: I1003 08:57:03.381117 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" podStartSLOduration=25.381050868 podStartE2EDuration="25.381050868s" podCreationTimestamp="2025-10-03 08:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:03.363713283 +0000 UTC m=+1009.716647601" watchObservedRunningTime="2025-10-03 08:57:03.381050868 +0000 UTC m=+1009.733985106" Oct 03 08:57:03 crc 
kubenswrapper[4790]: I1003 08:57:03.854219 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.171350 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86b4778477-cpjz6" podUID="007c7c03-1417-496d-b35d-242dbaacc078" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.106:5353: i/o timeout" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.339428 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007c7c03-1417-496d-b35d-242dbaacc078" path="/var/lib/kubelet/pods/007c7c03-1417-496d-b35d-242dbaacc078/volumes" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.362775 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3","Type":"ContainerStarted","Data":"5bbdb17b428727f50245fd2d02361bc8828b017e1b1f6e3d150e814a1d75e96d"} Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.365016 4790 generic.go:334] "Generic (PLEG): container finished" podID="9269d241-5eae-4eb1-b5a9-c4ac47ce177e" containerID="54852890d6da353794bab5dfcfcf9a00616e67989097493ae01945c983265da4" exitCode=0 Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.365065 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" event={"ID":"9269d241-5eae-4eb1-b5a9-c4ac47ce177e","Type":"ContainerDied","Data":"54852890d6da353794bab5dfcfcf9a00616e67989097493ae01945c983265da4"} Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.365085 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" event={"ID":"9269d241-5eae-4eb1-b5a9-c4ac47ce177e","Type":"ContainerDied","Data":"0aaf0d8e5dbae4a3d88f9853c6f4c735bcd837f1857cc0e228c107d09cebedb6"} Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.365096 4790 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aaf0d8e5dbae4a3d88f9853c6f4c735bcd837f1857cc0e228c107d09cebedb6" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.366399 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3b885fd9-8ed3-45b9-8e6a-63069940ae79","Type":"ContainerStarted","Data":"51f8c123aa66831aabfbb219b72f2b830de0f1591600aeb97e06a2a79bebb480"} Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.366858 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.368400 4790 generic.go:334] "Generic (PLEG): container finished" podID="18264edd-1eee-4089-ad97-63c7092a8650" containerID="d8ab163703c8c49478be3bc9dfe1a0f92c00e90e8156536e958921c089f5c294" exitCode=0 Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.368428 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788c899457-p84kc" event={"ID":"18264edd-1eee-4089-ad97-63c7092a8650","Type":"ContainerDied","Data":"d8ab163703c8c49478be3bc9dfe1a0f92c00e90e8156536e958921c089f5c294"} Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.465329 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.135578964 podStartE2EDuration="21.465309263s" podCreationTimestamp="2025-10-03 08:56:43 +0000 UTC" firstStartedPulling="2025-10-03 08:56:54.029163215 +0000 UTC m=+1000.382097453" lastFinishedPulling="2025-10-03 08:57:02.358893524 +0000 UTC m=+1008.711827752" observedRunningTime="2025-10-03 08:57:04.440306318 +0000 UTC m=+1010.793240566" watchObservedRunningTime="2025-10-03 08:57:04.465309263 +0000 UTC m=+1010.818243501" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.714231 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.846778 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7zzg\" (UniqueName: \"kubernetes.io/projected/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-kube-api-access-r7zzg\") pod \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.846890 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-config\") pod \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.846977 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-dns-svc\") pod \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\" (UID: \"9269d241-5eae-4eb1-b5a9-c4ac47ce177e\") " Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.857185 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-kube-api-access-r7zzg" (OuterVolumeSpecName: "kube-api-access-r7zzg") pod "9269d241-5eae-4eb1-b5a9-c4ac47ce177e" (UID: "9269d241-5eae-4eb1-b5a9-c4ac47ce177e"). InnerVolumeSpecName "kube-api-access-r7zzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.925393 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9269d241-5eae-4eb1-b5a9-c4ac47ce177e" (UID: "9269d241-5eae-4eb1-b5a9-c4ac47ce177e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.926921 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-config" (OuterVolumeSpecName: "config") pod "9269d241-5eae-4eb1-b5a9-c4ac47ce177e" (UID: "9269d241-5eae-4eb1-b5a9-c4ac47ce177e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.948374 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.948401 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:04 crc kubenswrapper[4790]: I1003 08:57:04.948410 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7zzg\" (UniqueName: \"kubernetes.io/projected/9269d241-5eae-4eb1-b5a9-c4ac47ce177e-kube-api-access-r7zzg\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:05 crc kubenswrapper[4790]: I1003 08:57:05.377683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c53865bf-810d-440a-bfa4-171e3183066c","Type":"ContainerStarted","Data":"8cab0390d30404db7e7397793cb026b5a505d26e0d519fe7d4558ae244c68805"} Oct 03 08:57:05 crc kubenswrapper[4790]: I1003 08:57:05.379776 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f5ab6b2b-5bba-4043-b116-b8d996df1ace","Type":"ContainerStarted","Data":"f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323"} Oct 03 08:57:05 crc kubenswrapper[4790]: I1003 08:57:05.381347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-vrsc7" event={"ID":"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec","Type":"ContainerStarted","Data":"f4ab55cc09c9d42fad9827d94b84b937e77bc26a95271d4cf8766a99e323fa3c"} Oct 03 08:57:05 crc kubenswrapper[4790]: I1003 08:57:05.383883 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a","Type":"ContainerStarted","Data":"d4a962e284dcef2bec7c09ae5ccf5a3c11d4db81c31472ba82d1c334cbde7f81"} Oct 03 08:57:05 crc kubenswrapper[4790]: I1003 08:57:05.383933 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9746b6c-rmnft" Oct 03 08:57:05 crc kubenswrapper[4790]: I1003 08:57:05.445467 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9746b6c-rmnft"] Oct 03 08:57:05 crc kubenswrapper[4790]: I1003 08:57:05.453800 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b9746b6c-rmnft"] Oct 03 08:57:06 crc kubenswrapper[4790]: I1003 08:57:06.339110 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9269d241-5eae-4eb1-b5a9-c4ac47ce177e" path="/var/lib/kubelet/pods/9269d241-5eae-4eb1-b5a9-c4ac47ce177e/volumes" Oct 03 08:57:06 crc kubenswrapper[4790]: I1003 08:57:06.392960 4790 generic.go:334] "Generic (PLEG): container finished" podID="5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec" containerID="f4ab55cc09c9d42fad9827d94b84b937e77bc26a95271d4cf8766a99e323fa3c" exitCode=0 Oct 03 08:57:06 crc kubenswrapper[4790]: I1003 08:57:06.393020 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrsc7" event={"ID":"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec","Type":"ContainerDied","Data":"f4ab55cc09c9d42fad9827d94b84b937e77bc26a95271d4cf8766a99e323fa3c"} Oct 03 08:57:06 crc kubenswrapper[4790]: I1003 08:57:06.397104 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"f79d57e8-680a-41b6-bca9-c233a695be4e","Type":"ContainerStarted","Data":"967ede7f4d897538e40daed769bfd55730474c6700fa08ab1a0dcd1960359c86"} Oct 03 08:57:06 crc kubenswrapper[4790]: I1003 08:57:06.410058 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3dc14abb-14bb-4bee-9864-8507e37d824d","Type":"ContainerStarted","Data":"b560c625c12679a3e883f9561a4c8ed6a1a1cabb6d9496291e354a5429835a90"} Oct 03 08:57:09 crc kubenswrapper[4790]: I1003 08:57:09.429811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a93cbe46-9c4b-49b9-916f-b26ed8c80fe9","Type":"ContainerStarted","Data":"1c1a53cc61127423e411317bf7e6b8b910d0c813b5d7526ab3073aa3cd1b416e"} Oct 03 08:57:11 crc kubenswrapper[4790]: I1003 08:57:11.449431 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"04926fe2-c5e2-4663-8945-448a17ec2a8b","Type":"ContainerStarted","Data":"d800fd3bcb397d0650138d460f6c0f701f0013b82b7baf3fc84e85e9723387fb"} Oct 03 08:57:12 crc kubenswrapper[4790]: I1003 08:57:12.490614 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8ljvm" event={"ID":"845dcf0a-43fb-4760-aac6-d12c02a8b785","Type":"ContainerStarted","Data":"465733d8c60532939a7e643d37c4a5b24ae34309f1c6a25e91fbc03be101ce97"} Oct 03 08:57:12 crc kubenswrapper[4790]: I1003 08:57:12.492134 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-8ljvm" Oct 03 08:57:12 crc kubenswrapper[4790]: I1003 08:57:12.495351 4790 generic.go:334] "Generic (PLEG): container finished" podID="7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" containerID="891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f" exitCode=0 Oct 03 08:57:12 crc kubenswrapper[4790]: I1003 08:57:12.495449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" 
event={"ID":"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90","Type":"ContainerDied","Data":"891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f"} Oct 03 08:57:12 crc kubenswrapper[4790]: I1003 08:57:12.500535 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerID="b560c625c12679a3e883f9561a4c8ed6a1a1cabb6d9496291e354a5429835a90" exitCode=0 Oct 03 08:57:12 crc kubenswrapper[4790]: I1003 08:57:12.500632 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3dc14abb-14bb-4bee-9864-8507e37d824d","Type":"ContainerDied","Data":"b560c625c12679a3e883f9561a4c8ed6a1a1cabb6d9496291e354a5429835a90"} Oct 03 08:57:12 crc kubenswrapper[4790]: I1003 08:57:12.504099 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"010b668f-22e4-4ad8-aae5-decf063cd49b","Type":"ContainerStarted","Data":"a4509cbef03a8a7e1516442a998cde882f083aa206498660c2948da3bb708306"} Oct 03 08:57:12 crc kubenswrapper[4790]: I1003 08:57:12.505362 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 08:57:12 crc kubenswrapper[4790]: I1003 08:57:12.513961 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-8ljvm" podStartSLOduration=14.5404003 podStartE2EDuration="23.513938945s" podCreationTimestamp="2025-10-03 08:56:49 +0000 UTC" firstStartedPulling="2025-10-03 08:56:53.97381394 +0000 UTC m=+1000.326748178" lastFinishedPulling="2025-10-03 08:57:02.947352585 +0000 UTC m=+1009.300286823" observedRunningTime="2025-10-03 08:57:12.513359679 +0000 UTC m=+1018.866293917" watchObservedRunningTime="2025-10-03 08:57:12.513938945 +0000 UTC m=+1018.866873183" Oct 03 08:57:12 crc kubenswrapper[4790]: I1003 08:57:12.625661 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" 
podStartSLOduration=18.513455553 podStartE2EDuration="27.625638823s" podCreationTimestamp="2025-10-03 08:56:45 +0000 UTC" firstStartedPulling="2025-10-03 08:56:53.984521283 +0000 UTC m=+1000.337455521" lastFinishedPulling="2025-10-03 08:57:03.096704543 +0000 UTC m=+1009.449638791" observedRunningTime="2025-10-03 08:57:12.606369816 +0000 UTC m=+1018.959304074" watchObservedRunningTime="2025-10-03 08:57:12.625638823 +0000 UTC m=+1018.978573071" Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.512384 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6330b59a-ef23-4482-9fa5-58ccb2f6bc5a","Type":"ContainerStarted","Data":"21802210c13ee1e44e226302ba97231b14e30dbc6d119637031b0e0c40d43fc0"} Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.514646 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"c53865bf-810d-440a-bfa4-171e3183066c","Type":"ContainerStarted","Data":"147d5781746328708f202f320a3dc2a7f8d5eb2a92b9c03e880592866d561fd7"} Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.516783 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" event={"ID":"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90","Type":"ContainerStarted","Data":"5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741"} Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.516915 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.518553 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788c899457-p84kc" event={"ID":"18264edd-1eee-4089-ad97-63c7092a8650","Type":"ContainerStarted","Data":"4aa9709240cf6f4d1db5efb1fa197164deef1387713c42477a9203230fd18d05"} Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.518923 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.520543 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrsc7" event={"ID":"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec","Type":"ContainerStarted","Data":"ad0b204dc5fef1b1ee8062bc53bc98772f511506b7298178c72581dbf29e92a2"} Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.520589 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vrsc7" event={"ID":"5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec","Type":"ContainerStarted","Data":"40db05f7a5716dcb2afe5a2c693496e2bae118533f5339716e6144d2cde048d5"} Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.520761 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.522054 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6vcdh" event={"ID":"f892381b-8fc9-4352-915e-62d0b92cb7b6","Type":"ContainerStarted","Data":"664c56361bc32962eb5dfe6d39fadc0c7082f786f0a70b52017a637069241566"} Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.536427 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.904488026 podStartE2EDuration="21.536410698s" podCreationTimestamp="2025-10-03 08:56:52 +0000 UTC" firstStartedPulling="2025-10-03 08:56:55.715257247 +0000 UTC m=+1002.068191485" lastFinishedPulling="2025-10-03 08:57:12.347179899 +0000 UTC m=+1018.700114157" observedRunningTime="2025-10-03 08:57:13.532349457 +0000 UTC m=+1019.885283695" watchObservedRunningTime="2025-10-03 08:57:13.536410698 +0000 UTC m=+1019.889344936" Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.555530 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-788c899457-p84kc" podStartSLOduration=18.55549805 
podStartE2EDuration="18.55549805s" podCreationTimestamp="2025-10-03 08:56:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:13.553833895 +0000 UTC m=+1019.906768173" watchObservedRunningTime="2025-10-03 08:57:13.55549805 +0000 UTC m=+1019.908432288" Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.603498 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-vrsc7" podStartSLOduration=16.631031958 podStartE2EDuration="24.603477285s" podCreationTimestamp="2025-10-03 08:56:49 +0000 UTC" firstStartedPulling="2025-10-03 08:56:54.263299856 +0000 UTC m=+1000.616234094" lastFinishedPulling="2025-10-03 08:57:02.235745183 +0000 UTC m=+1008.588679421" observedRunningTime="2025-10-03 08:57:13.571534259 +0000 UTC m=+1019.924468517" watchObservedRunningTime="2025-10-03 08:57:13.603477285 +0000 UTC m=+1019.956411523" Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.605755 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.378864866 podStartE2EDuration="24.605746696s" podCreationTimestamp="2025-10-03 08:56:49 +0000 UTC" firstStartedPulling="2025-10-03 08:56:54.077269612 +0000 UTC m=+1000.430203850" lastFinishedPulling="2025-10-03 08:57:12.304151442 +0000 UTC m=+1018.657085680" observedRunningTime="2025-10-03 08:57:13.589373438 +0000 UTC m=+1019.942307706" watchObservedRunningTime="2025-10-03 08:57:13.605746696 +0000 UTC m=+1019.958680934" Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.622477 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6vcdh" podStartSLOduration=9.52537023 podStartE2EDuration="19.622447363s" podCreationTimestamp="2025-10-03 08:56:54 +0000 UTC" firstStartedPulling="2025-10-03 08:57:02.235704892 +0000 UTC m=+1008.588639130" 
lastFinishedPulling="2025-10-03 08:57:12.332782025 +0000 UTC m=+1018.685716263" observedRunningTime="2025-10-03 08:57:13.609747896 +0000 UTC m=+1019.962682154" watchObservedRunningTime="2025-10-03 08:57:13.622447363 +0000 UTC m=+1019.975381601" Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.633775 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" podStartSLOduration=18.633757183 podStartE2EDuration="18.633757183s" podCreationTimestamp="2025-10-03 08:56:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:13.633427854 +0000 UTC m=+1019.986362102" watchObservedRunningTime="2025-10-03 08:57:13.633757183 +0000 UTC m=+1019.986691431" Oct 03 08:57:13 crc kubenswrapper[4790]: I1003 08:57:13.973756 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 03 08:57:14 crc kubenswrapper[4790]: I1003 08:57:14.273751 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 03 08:57:14 crc kubenswrapper[4790]: I1003 08:57:14.409740 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 03 08:57:14 crc kubenswrapper[4790]: I1003 08:57:14.453985 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 03 08:57:14 crc kubenswrapper[4790]: I1003 08:57:14.531182 4790 generic.go:334] "Generic (PLEG): container finished" podID="89aeb08e-2699-4c1b-99ef-4ae2cfce9de3" containerID="5bbdb17b428727f50245fd2d02361bc8828b017e1b1f6e3d150e814a1d75e96d" exitCode=0 Oct 03 08:57:14 crc kubenswrapper[4790]: I1003 08:57:14.531308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3","Type":"ContainerDied","Data":"5bbdb17b428727f50245fd2d02361bc8828b017e1b1f6e3d150e814a1d75e96d"} Oct 03 08:57:14 crc kubenswrapper[4790]: I1003 08:57:14.532088 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 03 08:57:14 crc kubenswrapper[4790]: I1003 08:57:14.532116 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:57:14 crc kubenswrapper[4790]: I1003 08:57:14.779017 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.275913 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.323659 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.543036 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"89aeb08e-2699-4c1b-99ef-4ae2cfce9de3","Type":"ContainerStarted","Data":"2e0696d426724a577ae26cb1af56c1fa763a226975e2f9198ab4b25fa897ab06"} Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.573983 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=25.244216832 podStartE2EDuration="33.573959581s" podCreationTimestamp="2025-10-03 08:56:42 +0000 UTC" firstStartedPulling="2025-10-03 08:56:54.025648249 +0000 UTC m=+1000.378582477" lastFinishedPulling="2025-10-03 08:57:02.355390978 +0000 UTC m=+1008.708325226" observedRunningTime="2025-10-03 08:57:15.562461957 +0000 UTC m=+1021.915396195" watchObservedRunningTime="2025-10-03 08:57:15.573959581 +0000 UTC m=+1021.926893839" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.583426 
4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.743794 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 03 08:57:15 crc kubenswrapper[4790]: E1003 08:57:15.744151 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007c7c03-1417-496d-b35d-242dbaacc078" containerName="init" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.744163 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="007c7c03-1417-496d-b35d-242dbaacc078" containerName="init" Oct 03 08:57:15 crc kubenswrapper[4790]: E1003 08:57:15.744181 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9269d241-5eae-4eb1-b5a9-c4ac47ce177e" containerName="init" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.744187 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9269d241-5eae-4eb1-b5a9-c4ac47ce177e" containerName="init" Oct 03 08:57:15 crc kubenswrapper[4790]: E1003 08:57:15.744202 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9269d241-5eae-4eb1-b5a9-c4ac47ce177e" containerName="dnsmasq-dns" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.744207 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9269d241-5eae-4eb1-b5a9-c4ac47ce177e" containerName="dnsmasq-dns" Oct 03 08:57:15 crc kubenswrapper[4790]: E1003 08:57:15.744233 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007c7c03-1417-496d-b35d-242dbaacc078" containerName="dnsmasq-dns" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.744239 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="007c7c03-1417-496d-b35d-242dbaacc078" containerName="dnsmasq-dns" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.744406 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9269d241-5eae-4eb1-b5a9-c4ac47ce177e" containerName="dnsmasq-dns" Oct 03 08:57:15 crc 
kubenswrapper[4790]: I1003 08:57:15.744421 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="007c7c03-1417-496d-b35d-242dbaacc078" containerName="dnsmasq-dns" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.745247 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.752029 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.753293 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.753482 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.753629 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-k4jz2" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.753737 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.859784 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.859841 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qpkj\" (UniqueName: \"kubernetes.io/projected/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-kube-api-access-5qpkj\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 
08:57:15.859900 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-scripts\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.859972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.860013 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.860081 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-config\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.860103 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.961960 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-scripts\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.962024 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.962055 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.962088 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-config\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.962105 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.962155 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc 
kubenswrapper[4790]: I1003 08:57:15.962175 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qpkj\" (UniqueName: \"kubernetes.io/projected/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-kube-api-access-5qpkj\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.963020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.963876 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-scripts\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.963940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-config\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.968951 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.970231 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" 
(UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.987961 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:15 crc kubenswrapper[4790]: I1003 08:57:15.996050 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qpkj\" (UniqueName: \"kubernetes.io/projected/b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b-kube-api-access-5qpkj\") pod \"ovn-northd-0\" (UID: \"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b\") " pod="openstack/ovn-northd-0" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.084015 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.095093 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.195727 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-788c899457-p84kc"] Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.197082 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-788c899457-p84kc" podUID="18264edd-1eee-4089-ad97-63c7092a8650" containerName="dnsmasq-dns" containerID="cri-o://4aa9709240cf6f4d1db5efb1fa197164deef1387713c42477a9203230fd18d05" gracePeriod=10 Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.226548 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675946d6fc-std9f"] Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.230518 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.258896 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675946d6fc-std9f"] Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.381543 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-nb\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.381983 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-dns-svc\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.382044 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txmgh\" (UniqueName: \"kubernetes.io/projected/dd4153b0-ab10-423f-bb3a-be4bac79c118-kube-api-access-txmgh\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.382141 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-sb\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.382191 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-config\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.491010 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-nb\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.491084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-dns-svc\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.491140 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txmgh\" (UniqueName: \"kubernetes.io/projected/dd4153b0-ab10-423f-bb3a-be4bac79c118-kube-api-access-txmgh\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.491226 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-sb\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.491260 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-config\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.492363 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-nb\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.492378 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-dns-svc\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.496318 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-sb\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.496481 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-config\") pod \"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.521929 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txmgh\" (UniqueName: \"kubernetes.io/projected/dd4153b0-ab10-423f-bb3a-be4bac79c118-kube-api-access-txmgh\") pod 
\"dnsmasq-dns-675946d6fc-std9f\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.556496 4790 generic.go:334] "Generic (PLEG): container finished" podID="010b668f-22e4-4ad8-aae5-decf063cd49b" containerID="a4509cbef03a8a7e1516442a998cde882f083aa206498660c2948da3bb708306" exitCode=0 Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.556554 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"010b668f-22e4-4ad8-aae5-decf063cd49b","Type":"ContainerDied","Data":"a4509cbef03a8a7e1516442a998cde882f083aa206498660c2948da3bb708306"} Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.559899 4790 generic.go:334] "Generic (PLEG): container finished" podID="18264edd-1eee-4089-ad97-63c7092a8650" containerID="4aa9709240cf6f4d1db5efb1fa197164deef1387713c42477a9203230fd18d05" exitCode=0 Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.559952 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788c899457-p84kc" event={"ID":"18264edd-1eee-4089-ad97-63c7092a8650","Type":"ContainerDied","Data":"4aa9709240cf6f4d1db5efb1fa197164deef1387713c42477a9203230fd18d05"} Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.600723 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.710402 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.924015 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.998493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w72nw\" (UniqueName: \"kubernetes.io/projected/18264edd-1eee-4089-ad97-63c7092a8650-kube-api-access-w72nw\") pod \"18264edd-1eee-4089-ad97-63c7092a8650\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.998620 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-ovsdbserver-sb\") pod \"18264edd-1eee-4089-ad97-63c7092a8650\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.998739 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-config\") pod \"18264edd-1eee-4089-ad97-63c7092a8650\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " Oct 03 08:57:16 crc kubenswrapper[4790]: I1003 08:57:16.998872 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-dns-svc\") pod \"18264edd-1eee-4089-ad97-63c7092a8650\" (UID: \"18264edd-1eee-4089-ad97-63c7092a8650\") " Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.009312 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18264edd-1eee-4089-ad97-63c7092a8650-kube-api-access-w72nw" (OuterVolumeSpecName: "kube-api-access-w72nw") pod "18264edd-1eee-4089-ad97-63c7092a8650" (UID: "18264edd-1eee-4089-ad97-63c7092a8650"). InnerVolumeSpecName "kube-api-access-w72nw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.067174 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18264edd-1eee-4089-ad97-63c7092a8650" (UID: "18264edd-1eee-4089-ad97-63c7092a8650"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.092088 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-config" (OuterVolumeSpecName: "config") pod "18264edd-1eee-4089-ad97-63c7092a8650" (UID: "18264edd-1eee-4089-ad97-63c7092a8650"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.092560 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18264edd-1eee-4089-ad97-63c7092a8650" (UID: "18264edd-1eee-4089-ad97-63c7092a8650"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.104012 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.104036 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.104072 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w72nw\" (UniqueName: \"kubernetes.io/projected/18264edd-1eee-4089-ad97-63c7092a8650-kube-api-access-w72nw\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.104084 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18264edd-1eee-4089-ad97-63c7092a8650-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.325527 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675946d6fc-std9f"] Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.331588 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 03 08:57:17 crc kubenswrapper[4790]: E1003 08:57:17.331994 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18264edd-1eee-4089-ad97-63c7092a8650" containerName="dnsmasq-dns" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.332017 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="18264edd-1eee-4089-ad97-63c7092a8650" containerName="dnsmasq-dns" Oct 03 08:57:17 crc kubenswrapper[4790]: E1003 08:57:17.332037 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18264edd-1eee-4089-ad97-63c7092a8650" 
containerName="init" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.332043 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="18264edd-1eee-4089-ad97-63c7092a8650" containerName="init" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.332216 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="18264edd-1eee-4089-ad97-63c7092a8650" containerName="dnsmasq-dns" Oct 03 08:57:17 crc kubenswrapper[4790]: W1003 08:57:17.333317 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd4153b0_ab10_423f_bb3a_be4bac79c118.slice/crio-f54334b10bed2f7e8edd859681bff19c659e0ce02efa3af5d54188dac7ee56ca WatchSource:0}: Error finding container f54334b10bed2f7e8edd859681bff19c659e0ce02efa3af5d54188dac7ee56ca: Status 404 returned error can't find the container with id f54334b10bed2f7e8edd859681bff19c659e0ce02efa3af5d54188dac7ee56ca Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.338652 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.351858 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.352206 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.352299 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.352401 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-stjdn" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.355444 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.409180 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d6c105dc-48df-4d07-8a82-ca93e78ad98c-cache\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.409347 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.409374 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc 
kubenswrapper[4790]: I1003 08:57:17.409445 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjk7c\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-kube-api-access-qjk7c\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.409496 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d6c105dc-48df-4d07-8a82-ca93e78ad98c-lock\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.511470 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d6c105dc-48df-4d07-8a82-ca93e78ad98c-lock\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.511532 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d6c105dc-48df-4d07-8a82-ca93e78ad98c-cache\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.511604 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.511629 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.511709 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjk7c\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-kube-api-access-qjk7c\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: E1003 08:57:17.512003 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.512075 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.512055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d6c105dc-48df-4d07-8a82-ca93e78ad98c-cache\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: E1003 08:57:17.512091 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:57:17 crc kubenswrapper[4790]: E1003 08:57:17.512235 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift podName:d6c105dc-48df-4d07-8a82-ca93e78ad98c nodeName:}" failed. 
No retries permitted until 2025-10-03 08:57:18.012216505 +0000 UTC m=+1024.365150743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift") pod "swift-storage-0" (UID: "d6c105dc-48df-4d07-8a82-ca93e78ad98c") : configmap "swift-ring-files" not found Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.512024 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d6c105dc-48df-4d07-8a82-ca93e78ad98c-lock\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.534666 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjk7c\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-kube-api-access-qjk7c\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.537797 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.569727 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b","Type":"ContainerStarted","Data":"203168a4c3312a09238754550a4f6e61387d2235c4183757c7723615e27f6385"} Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.574158 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675946d6fc-std9f" 
event={"ID":"dd4153b0-ab10-423f-bb3a-be4bac79c118","Type":"ContainerStarted","Data":"f54334b10bed2f7e8edd859681bff19c659e0ce02efa3af5d54188dac7ee56ca"} Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.576472 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"010b668f-22e4-4ad8-aae5-decf063cd49b","Type":"ContainerStarted","Data":"523e171dfd8352bceb5c4073995f82cda841c7081b9c67da0a59b04415c423b7"} Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.580002 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-788c899457-p84kc" event={"ID":"18264edd-1eee-4089-ad97-63c7092a8650","Type":"ContainerDied","Data":"aa04d87a1f8fcbb769584d4702b66ef9d33b99a5aca3815032dbafa8911569d2"} Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.580086 4790 scope.go:117] "RemoveContainer" containerID="4aa9709240cf6f4d1db5efb1fa197164deef1387713c42477a9203230fd18d05" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.580041 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-788c899457-p84kc" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.624145 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.651213872 podStartE2EDuration="35.624123699s" podCreationTimestamp="2025-10-03 08:56:42 +0000 UTC" firstStartedPulling="2025-10-03 08:56:53.987109764 +0000 UTC m=+1000.340044002" lastFinishedPulling="2025-10-03 08:57:02.960019601 +0000 UTC m=+1009.312953829" observedRunningTime="2025-10-03 08:57:17.61648992 +0000 UTC m=+1023.969424158" watchObservedRunningTime="2025-10-03 08:57:17.624123699 +0000 UTC m=+1023.977057937" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.676309 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-788c899457-p84kc"] Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.677476 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-788c899457-p84kc"] Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.827892 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-5tpvp"] Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.828925 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.831431 4790 scope.go:117] "RemoveContainer" containerID="d8ab163703c8c49478be3bc9dfe1a0f92c00e90e8156536e958921c089f5c294" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.833396 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.834197 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.834370 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.858447 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5tpvp"] Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.918719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-ring-data-devices\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.919091 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-combined-ca-bundle\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.919139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-etc-swift\") 
pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.919169 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-scripts\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.919189 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26qt4\" (UniqueName: \"kubernetes.io/projected/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-kube-api-access-26qt4\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.919208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-swiftconf\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:17 crc kubenswrapper[4790]: I1003 08:57:17.919290 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-dispersionconf\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.020148 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-scripts\") pod 
\"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.020185 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26qt4\" (UniqueName: \"kubernetes.io/projected/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-kube-api-access-26qt4\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.020208 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-swiftconf\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.020238 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-dispersionconf\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.020299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.020314 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-ring-data-devices\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " 
pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.020353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-combined-ca-bundle\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.020391 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-etc-swift\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.020900 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-etc-swift\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.021185 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-scripts\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: E1003 08:57:18.021385 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:57:18 crc kubenswrapper[4790]: E1003 08:57:18.021412 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:57:18 crc kubenswrapper[4790]: E1003 08:57:18.021463 4790 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift podName:d6c105dc-48df-4d07-8a82-ca93e78ad98c nodeName:}" failed. No retries permitted until 2025-10-03 08:57:19.021445067 +0000 UTC m=+1025.374379395 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift") pod "swift-storage-0" (UID: "d6c105dc-48df-4d07-8a82-ca93e78ad98c") : configmap "swift-ring-files" not found Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.022395 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-ring-data-devices\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.024306 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-dispersionconf\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.024722 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-swiftconf\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.032326 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-combined-ca-bundle\") pod \"swift-ring-rebalance-5tpvp\" (UID: 
\"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.039415 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26qt4\" (UniqueName: \"kubernetes.io/projected/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-kube-api-access-26qt4\") pod \"swift-ring-rebalance-5tpvp\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.152377 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:18 crc kubenswrapper[4790]: E1003 08:57:18.267825 4790 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.113:51716->38.102.83.113:44451: read tcp 38.102.83.113:51716->38.102.83.113:44451: read: connection reset by peer Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.349556 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18264edd-1eee-4089-ad97-63c7092a8650" path="/var/lib/kubelet/pods/18264edd-1eee-4089-ad97-63c7092a8650/volumes" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.591547 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b","Type":"ContainerStarted","Data":"e70119ebc5d1d9d74bc2379369ec1a3641aff2132e1907c59602a2b977476f5d"} Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.591930 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b","Type":"ContainerStarted","Data":"6115e3faa50274567639fd3a67accad4e804b53506e703a29f563125f22f200c"} Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.591952 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 
08:57:18.595030 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd4153b0-ab10-423f-bb3a-be4bac79c118" containerID="7fcda0e7cf3b307ca9c13a7755139b6213989832222956f1fe141e835ff7c023" exitCode=0 Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.595065 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675946d6fc-std9f" event={"ID":"dd4153b0-ab10-423f-bb3a-be4bac79c118","Type":"ContainerDied","Data":"7fcda0e7cf3b307ca9c13a7755139b6213989832222956f1fe141e835ff7c023"} Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.609262 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.584622978 podStartE2EDuration="3.609245959s" podCreationTimestamp="2025-10-03 08:57:15 +0000 UTC" firstStartedPulling="2025-10-03 08:57:16.9031103 +0000 UTC m=+1023.256044538" lastFinishedPulling="2025-10-03 08:57:17.927733281 +0000 UTC m=+1024.280667519" observedRunningTime="2025-10-03 08:57:18.60705113 +0000 UTC m=+1024.959985368" watchObservedRunningTime="2025-10-03 08:57:18.609245959 +0000 UTC m=+1024.962180197" Oct 03 08:57:18 crc kubenswrapper[4790]: I1003 08:57:18.654815 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-5tpvp"] Oct 03 08:57:19 crc kubenswrapper[4790]: I1003 08:57:19.043603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:19 crc kubenswrapper[4790]: E1003 08:57:19.043851 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:57:19 crc kubenswrapper[4790]: E1003 08:57:19.043887 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: 
configmap "swift-ring-files" not found Oct 03 08:57:19 crc kubenswrapper[4790]: E1003 08:57:19.043953 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift podName:d6c105dc-48df-4d07-8a82-ca93e78ad98c nodeName:}" failed. No retries permitted until 2025-10-03 08:57:21.043931531 +0000 UTC m=+1027.396865769 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift") pod "swift-storage-0" (UID: "d6c105dc-48df-4d07-8a82-ca93e78ad98c") : configmap "swift-ring-files" not found Oct 03 08:57:20 crc kubenswrapper[4790]: W1003 08:57:20.643933 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f9b13ce_6ec8_4e05_b395_c7ec580823ed.slice/crio-4a45eae668d0c92e71c27b5f2930e067bc30576f47b15890282712a45fd39498 WatchSource:0}: Error finding container 4a45eae668d0c92e71c27b5f2930e067bc30576f47b15890282712a45fd39498: Status 404 returned error can't find the container with id 4a45eae668d0c92e71c27b5f2930e067bc30576f47b15890282712a45fd39498 Oct 03 08:57:20 crc kubenswrapper[4790]: I1003 08:57:20.839745 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:57:21 crc kubenswrapper[4790]: I1003 08:57:21.080801 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:21 crc kubenswrapper[4790]: E1003 08:57:21.081007 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:57:21 crc kubenswrapper[4790]: E1003 08:57:21.081039 4790 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:57:21 crc kubenswrapper[4790]: E1003 08:57:21.081105 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift podName:d6c105dc-48df-4d07-8a82-ca93e78ad98c nodeName:}" failed. No retries permitted until 2025-10-03 08:57:25.081082783 +0000 UTC m=+1031.434017031 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift") pod "swift-storage-0" (UID: "d6c105dc-48df-4d07-8a82-ca93e78ad98c") : configmap "swift-ring-files" not found Oct 03 08:57:21 crc kubenswrapper[4790]: I1003 08:57:21.622339 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5tpvp" event={"ID":"4f9b13ce-6ec8-4e05-b395-c7ec580823ed","Type":"ContainerStarted","Data":"4a45eae668d0c92e71c27b5f2930e067bc30576f47b15890282712a45fd39498"} Oct 03 08:57:21 crc kubenswrapper[4790]: I1003 08:57:21.627045 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675946d6fc-std9f" event={"ID":"dd4153b0-ab10-423f-bb3a-be4bac79c118","Type":"ContainerStarted","Data":"dc3f67af7517d1bed8849c40f9891ea9a8e11f9f2be2fb0354d64d3b6488e4ca"} Oct 03 08:57:21 crc kubenswrapper[4790]: I1003 08:57:21.627185 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:21 crc kubenswrapper[4790]: I1003 08:57:21.631046 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3dc14abb-14bb-4bee-9864-8507e37d824d","Type":"ContainerStarted","Data":"3d67417773adb7ca519c4b78e60ec103659dc2679f5b2b7d1c5377b7affff09e"} Oct 03 08:57:21 crc kubenswrapper[4790]: I1003 08:57:21.649652 4790 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-675946d6fc-std9f" podStartSLOduration=5.649636118 podStartE2EDuration="5.649636118s" podCreationTimestamp="2025-10-03 08:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:21.645790893 +0000 UTC m=+1027.998725151" watchObservedRunningTime="2025-10-03 08:57:21.649636118 +0000 UTC m=+1028.002570356" Oct 03 08:57:23 crc kubenswrapper[4790]: I1003 08:57:23.445048 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 03 08:57:23 crc kubenswrapper[4790]: I1003 08:57:23.455563 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 03 08:57:23 crc kubenswrapper[4790]: I1003 08:57:23.528839 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 03 08:57:23 crc kubenswrapper[4790]: I1003 08:57:23.570817 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 03 08:57:23 crc kubenswrapper[4790]: I1003 08:57:23.570881 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 03 08:57:23 crc kubenswrapper[4790]: I1003 08:57:23.623925 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 03 08:57:23 crc kubenswrapper[4790]: I1003 08:57:23.710562 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 03 08:57:23 crc kubenswrapper[4790]: I1003 08:57:23.715303 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 03 08:57:23 crc kubenswrapper[4790]: I1003 08:57:23.993283 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-create-28lqw"] Oct 03 08:57:23 crc kubenswrapper[4790]: I1003 08:57:23.994639 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-28lqw" Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.008984 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-28lqw"] Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.118591 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8pfjr"] Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.119711 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8pfjr" Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.130691 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8pfjr"] Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.158863 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl76n\" (UniqueName: \"kubernetes.io/projected/d5099e4b-6707-4f2c-9d6f-504c101cf5c4-kube-api-access-wl76n\") pod \"placement-db-create-28lqw\" (UID: \"d5099e4b-6707-4f2c-9d6f-504c101cf5c4\") " pod="openstack/placement-db-create-28lqw" Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.260798 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl76n\" (UniqueName: \"kubernetes.io/projected/d5099e4b-6707-4f2c-9d6f-504c101cf5c4-kube-api-access-wl76n\") pod \"placement-db-create-28lqw\" (UID: \"d5099e4b-6707-4f2c-9d6f-504c101cf5c4\") " pod="openstack/placement-db-create-28lqw" Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.260931 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg5th\" (UniqueName: \"kubernetes.io/projected/ba913c08-cae4-40c2-9ec8-a4865beb0918-kube-api-access-vg5th\") pod 
\"glance-db-create-8pfjr\" (UID: \"ba913c08-cae4-40c2-9ec8-a4865beb0918\") " pod="openstack/glance-db-create-8pfjr" Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.281414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl76n\" (UniqueName: \"kubernetes.io/projected/d5099e4b-6707-4f2c-9d6f-504c101cf5c4-kube-api-access-wl76n\") pod \"placement-db-create-28lqw\" (UID: \"d5099e4b-6707-4f2c-9d6f-504c101cf5c4\") " pod="openstack/placement-db-create-28lqw" Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.316285 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-28lqw" Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.362845 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg5th\" (UniqueName: \"kubernetes.io/projected/ba913c08-cae4-40c2-9ec8-a4865beb0918-kube-api-access-vg5th\") pod \"glance-db-create-8pfjr\" (UID: \"ba913c08-cae4-40c2-9ec8-a4865beb0918\") " pod="openstack/glance-db-create-8pfjr" Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.384687 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg5th\" (UniqueName: \"kubernetes.io/projected/ba913c08-cae4-40c2-9ec8-a4865beb0918-kube-api-access-vg5th\") pod \"glance-db-create-8pfjr\" (UID: \"ba913c08-cae4-40c2-9ec8-a4865beb0918\") " pod="openstack/glance-db-create-8pfjr" Oct 03 08:57:24 crc kubenswrapper[4790]: I1003 08:57:24.441518 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8pfjr" Oct 03 08:57:25 crc kubenswrapper[4790]: I1003 08:57:25.082627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:25 crc kubenswrapper[4790]: E1003 08:57:25.082884 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:57:25 crc kubenswrapper[4790]: E1003 08:57:25.082918 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:57:25 crc kubenswrapper[4790]: E1003 08:57:25.082992 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift podName:d6c105dc-48df-4d07-8a82-ca93e78ad98c nodeName:}" failed. No retries permitted until 2025-10-03 08:57:33.082970043 +0000 UTC m=+1039.435904291 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift") pod "swift-storage-0" (UID: "d6c105dc-48df-4d07-8a82-ca93e78ad98c") : configmap "swift-ring-files" not found Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.150101 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-dxr5m"] Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.151631 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-dxr5m" Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.165356 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-dxr5m"] Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.306430 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rfg\" (UniqueName: \"kubernetes.io/projected/03453a08-49f9-4433-ba29-bd7f5ce716ff-kube-api-access-66rfg\") pod \"watcher-db-create-dxr5m\" (UID: \"03453a08-49f9-4433-ba29-bd7f5ce716ff\") " pod="openstack/watcher-db-create-dxr5m" Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.408914 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rfg\" (UniqueName: \"kubernetes.io/projected/03453a08-49f9-4433-ba29-bd7f5ce716ff-kube-api-access-66rfg\") pod \"watcher-db-create-dxr5m\" (UID: \"03453a08-49f9-4433-ba29-bd7f5ce716ff\") " pod="openstack/watcher-db-create-dxr5m" Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.435063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66rfg\" (UniqueName: \"kubernetes.io/projected/03453a08-49f9-4433-ba29-bd7f5ce716ff-kube-api-access-66rfg\") pod \"watcher-db-create-dxr5m\" (UID: \"03453a08-49f9-4433-ba29-bd7f5ce716ff\") " pod="openstack/watcher-db-create-dxr5m" Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.484876 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-dxr5m" Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.602775 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.670445 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfbdb9bd9-46r9c"] Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.670752 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" podUID="7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" containerName="dnsmasq-dns" containerID="cri-o://5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741" gracePeriod=10 Oct 03 08:57:26 crc kubenswrapper[4790]: I1003 08:57:26.693074 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3dc14abb-14bb-4bee-9864-8507e37d824d","Type":"ContainerStarted","Data":"e8d3fa0b987f5e1092abe139ea488243df2e034303948d70d6e0e2f553df6929"} Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.264928 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.369052 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-dxr5m"] Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.426801 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hwkw\" (UniqueName: \"kubernetes.io/projected/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-kube-api-access-7hwkw\") pod \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.427276 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-sb\") pod \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.427373 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-config\") pod \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.427553 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-dns-svc\") pod \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.427740 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-nb\") pod \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\" (UID: \"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90\") " Oct 03 08:57:27 
crc kubenswrapper[4790]: I1003 08:57:27.438970 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-kube-api-access-7hwkw" (OuterVolumeSpecName: "kube-api-access-7hwkw") pod "7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" (UID: "7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90"). InnerVolumeSpecName "kube-api-access-7hwkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.476161 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" (UID: "7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.492512 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-config" (OuterVolumeSpecName: "config") pod "7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" (UID: "7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.503974 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" (UID: "7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.521425 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" (UID: "7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.525572 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-28lqw"] Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.530306 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hwkw\" (UniqueName: \"kubernetes.io/projected/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-kube-api-access-7hwkw\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.530359 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.530372 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.530385 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.530398 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:27 
crc kubenswrapper[4790]: I1003 08:57:27.536932 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8pfjr"] Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.702259 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-dxr5m" event={"ID":"03453a08-49f9-4433-ba29-bd7f5ce716ff","Type":"ContainerStarted","Data":"251cab79401a7d0bb4ea271e7f2932757bd2faf71661c89056cfa92b60e54933"} Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.704456 4790 generic.go:334] "Generic (PLEG): container finished" podID="7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" containerID="5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741" exitCode=0 Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.704520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" event={"ID":"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90","Type":"ContainerDied","Data":"5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741"} Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.704546 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" event={"ID":"7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90","Type":"ContainerDied","Data":"f6996b1cdf8dbb5992e59898124f786bad7d76f3282b2d2a47e309254a46e8cc"} Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.704628 4790 scope.go:117] "RemoveContainer" containerID="5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.704866 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfbdb9bd9-46r9c" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.710709 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5tpvp" event={"ID":"4f9b13ce-6ec8-4e05-b395-c7ec580823ed","Type":"ContainerStarted","Data":"d29ac79fa7d0ddf1e73030b7d05bc87f015a103898a4ebfba90722cb2320679b"} Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.747894 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-5tpvp" podStartSLOduration=4.55289541 podStartE2EDuration="10.747864062s" podCreationTimestamp="2025-10-03 08:57:17 +0000 UTC" firstStartedPulling="2025-10-03 08:57:20.647518923 +0000 UTC m=+1027.000453161" lastFinishedPulling="2025-10-03 08:57:26.842487575 +0000 UTC m=+1033.195421813" observedRunningTime="2025-10-03 08:57:27.743514553 +0000 UTC m=+1034.096448791" watchObservedRunningTime="2025-10-03 08:57:27.747864062 +0000 UTC m=+1034.100798320" Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.769928 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfbdb9bd9-46r9c"] Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.777008 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bfbdb9bd9-46r9c"] Oct 03 08:57:27 crc kubenswrapper[4790]: W1003 08:57:27.897540 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba913c08_cae4_40c2_9ec8_a4865beb0918.slice/crio-85bf6e02916b6926c3f3e634daca0fb102b7a1c864ca039bac8713898e335c90 WatchSource:0}: Error finding container 85bf6e02916b6926c3f3e634daca0fb102b7a1c864ca039bac8713898e335c90: Status 404 returned error can't find the container with id 85bf6e02916b6926c3f3e634daca0fb102b7a1c864ca039bac8713898e335c90 Oct 03 08:57:27 crc kubenswrapper[4790]: I1003 08:57:27.907997 4790 scope.go:117] "RemoveContainer" 
containerID="891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f" Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.013632 4790 scope.go:117] "RemoveContainer" containerID="5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741" Oct 03 08:57:28 crc kubenswrapper[4790]: E1003 08:57:28.014610 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741\": container with ID starting with 5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741 not found: ID does not exist" containerID="5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741" Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.014679 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741"} err="failed to get container status \"5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741\": rpc error: code = NotFound desc = could not find container \"5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741\": container with ID starting with 5297c1d1c550f585c8f121fa581ec21e03dab18111f77659d2c048346d132741 not found: ID does not exist" Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.014714 4790 scope.go:117] "RemoveContainer" containerID="891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f" Oct 03 08:57:28 crc kubenswrapper[4790]: E1003 08:57:28.015285 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f\": container with ID starting with 891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f not found: ID does not exist" containerID="891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f" Oct 03 08:57:28 crc 
kubenswrapper[4790]: I1003 08:57:28.015356 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f"} err="failed to get container status \"891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f\": rpc error: code = NotFound desc = could not find container \"891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f\": container with ID starting with 891481d62dee7ee7b7166ac6f6aa0df403064c27e650a6377c91e10ec268967f not found: ID does not exist" Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.345351 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" path="/var/lib/kubelet/pods/7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90/volumes" Oct 03 08:57:28 crc kubenswrapper[4790]: E1003 08:57:28.372551 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5099e4b_6707_4f2c_9d6f_504c101cf5c4.slice/crio-854d53f0aaa919fc6a3c59bd76bb5f09d0b211de937538ba392a28e415ca8db2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba913c08_cae4_40c2_9ec8_a4865beb0918.slice/crio-conmon-002a0e39313fce8f4fe9012d3c16ea9336cdf187d2705860eccd71d0763552ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03453a08_49f9_4433_ba29_bd7f5ce716ff.slice/crio-c380022c023961acb7fa7fb3fe1b74bb348a1e3aa4d56b08a9866dd031a1c876.scope\": RecentStats: unable to find data in memory cache]" Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.735178 4790 generic.go:334] "Generic (PLEG): container finished" podID="d5099e4b-6707-4f2c-9d6f-504c101cf5c4" containerID="854d53f0aaa919fc6a3c59bd76bb5f09d0b211de937538ba392a28e415ca8db2" exitCode=0 Oct 03 08:57:28 
crc kubenswrapper[4790]: I1003 08:57:28.735290 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-28lqw" event={"ID":"d5099e4b-6707-4f2c-9d6f-504c101cf5c4","Type":"ContainerDied","Data":"854d53f0aaa919fc6a3c59bd76bb5f09d0b211de937538ba392a28e415ca8db2"} Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.735737 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-28lqw" event={"ID":"d5099e4b-6707-4f2c-9d6f-504c101cf5c4","Type":"ContainerStarted","Data":"2c05f74af8d6bbd7806aaa5a8a5cfeaa9e61b8fdaf64fabc80251f2318553821"} Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.738427 4790 generic.go:334] "Generic (PLEG): container finished" podID="03453a08-49f9-4433-ba29-bd7f5ce716ff" containerID="c380022c023961acb7fa7fb3fe1b74bb348a1e3aa4d56b08a9866dd031a1c876" exitCode=0 Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.738475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-dxr5m" event={"ID":"03453a08-49f9-4433-ba29-bd7f5ce716ff","Type":"ContainerDied","Data":"c380022c023961acb7fa7fb3fe1b74bb348a1e3aa4d56b08a9866dd031a1c876"} Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.740850 4790 generic.go:334] "Generic (PLEG): container finished" podID="ba913c08-cae4-40c2-9ec8-a4865beb0918" containerID="002a0e39313fce8f4fe9012d3c16ea9336cdf187d2705860eccd71d0763552ae" exitCode=0 Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.740953 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8pfjr" event={"ID":"ba913c08-cae4-40c2-9ec8-a4865beb0918","Type":"ContainerDied","Data":"002a0e39313fce8f4fe9012d3c16ea9336cdf187d2705860eccd71d0763552ae"} Oct 03 08:57:28 crc kubenswrapper[4790]: I1003 08:57:28.740995 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8pfjr" 
event={"ID":"ba913c08-cae4-40c2-9ec8-a4865beb0918","Type":"ContainerStarted","Data":"85bf6e02916b6926c3f3e634daca0fb102b7a1c864ca039bac8713898e335c90"} Oct 03 08:57:30 crc kubenswrapper[4790]: I1003 08:57:30.765453 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8pfjr" event={"ID":"ba913c08-cae4-40c2-9ec8-a4865beb0918","Type":"ContainerDied","Data":"85bf6e02916b6926c3f3e634daca0fb102b7a1c864ca039bac8713898e335c90"} Oct 03 08:57:30 crc kubenswrapper[4790]: I1003 08:57:30.766082 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85bf6e02916b6926c3f3e634daca0fb102b7a1c864ca039bac8713898e335c90" Oct 03 08:57:30 crc kubenswrapper[4790]: I1003 08:57:30.780450 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-28lqw" event={"ID":"d5099e4b-6707-4f2c-9d6f-504c101cf5c4","Type":"ContainerDied","Data":"2c05f74af8d6bbd7806aaa5a8a5cfeaa9e61b8fdaf64fabc80251f2318553821"} Oct 03 08:57:30 crc kubenswrapper[4790]: I1003 08:57:30.780496 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c05f74af8d6bbd7806aaa5a8a5cfeaa9e61b8fdaf64fabc80251f2318553821" Oct 03 08:57:30 crc kubenswrapper[4790]: I1003 08:57:30.783163 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-dxr5m" event={"ID":"03453a08-49f9-4433-ba29-bd7f5ce716ff","Type":"ContainerDied","Data":"251cab79401a7d0bb4ea271e7f2932757bd2faf71661c89056cfa92b60e54933"} Oct 03 08:57:30 crc kubenswrapper[4790]: I1003 08:57:30.783318 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="251cab79401a7d0bb4ea271e7f2932757bd2faf71661c89056cfa92b60e54933" Oct 03 08:57:30 crc kubenswrapper[4790]: I1003 08:57:30.847807 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8pfjr" Oct 03 08:57:30 crc kubenswrapper[4790]: I1003 08:57:30.856035 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-dxr5m" Oct 03 08:57:30 crc kubenswrapper[4790]: I1003 08:57:30.862293 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-28lqw" Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.012157 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg5th\" (UniqueName: \"kubernetes.io/projected/ba913c08-cae4-40c2-9ec8-a4865beb0918-kube-api-access-vg5th\") pod \"ba913c08-cae4-40c2-9ec8-a4865beb0918\" (UID: \"ba913c08-cae4-40c2-9ec8-a4865beb0918\") " Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.012247 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl76n\" (UniqueName: \"kubernetes.io/projected/d5099e4b-6707-4f2c-9d6f-504c101cf5c4-kube-api-access-wl76n\") pod \"d5099e4b-6707-4f2c-9d6f-504c101cf5c4\" (UID: \"d5099e4b-6707-4f2c-9d6f-504c101cf5c4\") " Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.012301 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rfg\" (UniqueName: \"kubernetes.io/projected/03453a08-49f9-4433-ba29-bd7f5ce716ff-kube-api-access-66rfg\") pod \"03453a08-49f9-4433-ba29-bd7f5ce716ff\" (UID: \"03453a08-49f9-4433-ba29-bd7f5ce716ff\") " Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.021157 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba913c08-cae4-40c2-9ec8-a4865beb0918-kube-api-access-vg5th" (OuterVolumeSpecName: "kube-api-access-vg5th") pod "ba913c08-cae4-40c2-9ec8-a4865beb0918" (UID: "ba913c08-cae4-40c2-9ec8-a4865beb0918"). InnerVolumeSpecName "kube-api-access-vg5th". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.021240 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03453a08-49f9-4433-ba29-bd7f5ce716ff-kube-api-access-66rfg" (OuterVolumeSpecName: "kube-api-access-66rfg") pod "03453a08-49f9-4433-ba29-bd7f5ce716ff" (UID: "03453a08-49f9-4433-ba29-bd7f5ce716ff"). InnerVolumeSpecName "kube-api-access-66rfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.021528 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5099e4b-6707-4f2c-9d6f-504c101cf5c4-kube-api-access-wl76n" (OuterVolumeSpecName: "kube-api-access-wl76n") pod "d5099e4b-6707-4f2c-9d6f-504c101cf5c4" (UID: "d5099e4b-6707-4f2c-9d6f-504c101cf5c4"). InnerVolumeSpecName "kube-api-access-wl76n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.116638 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg5th\" (UniqueName: \"kubernetes.io/projected/ba913c08-cae4-40c2-9ec8-a4865beb0918-kube-api-access-vg5th\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.116940 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl76n\" (UniqueName: \"kubernetes.io/projected/d5099e4b-6707-4f2c-9d6f-504c101cf5c4-kube-api-access-wl76n\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.117035 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rfg\" (UniqueName: \"kubernetes.io/projected/03453a08-49f9-4433-ba29-bd7f5ce716ff-kube-api-access-66rfg\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.149398 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 
03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.794899 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-dxr5m" Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.798622 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3dc14abb-14bb-4bee-9864-8507e37d824d","Type":"ContainerStarted","Data":"66efb58d24c825aea5e1279e2cf9068a06ecd1fcb3114cda1bd5d487cf794226"} Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.798687 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-28lqw" Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.798929 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8pfjr" Oct 03 08:57:31 crc kubenswrapper[4790]: I1003 08:57:31.879524 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=9.065928129 podStartE2EDuration="45.879501046s" podCreationTimestamp="2025-10-03 08:56:46 +0000 UTC" firstStartedPulling="2025-10-03 08:56:54.024611021 +0000 UTC m=+1000.377545259" lastFinishedPulling="2025-10-03 08:57:30.838183938 +0000 UTC m=+1037.191118176" observedRunningTime="2025-10-03 08:57:31.836063697 +0000 UTC m=+1038.188997945" watchObservedRunningTime="2025-10-03 08:57:31.879501046 +0000 UTC m=+1038.232435294" Oct 03 08:57:32 crc kubenswrapper[4790]: I1003 08:57:32.419785 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:32 crc kubenswrapper[4790]: I1003 08:57:32.419847 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:32 crc kubenswrapper[4790]: I1003 08:57:32.424016 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:32 crc kubenswrapper[4790]: I1003 08:57:32.806669 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.151894 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:33 crc kubenswrapper[4790]: E1003 08:57:33.152103 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 03 08:57:33 crc kubenswrapper[4790]: E1003 08:57:33.152121 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 03 08:57:33 crc kubenswrapper[4790]: E1003 08:57:33.152160 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift podName:d6c105dc-48df-4d07-8a82-ca93e78ad98c nodeName:}" failed. No retries permitted until 2025-10-03 08:57:49.152145339 +0000 UTC m=+1055.505079577 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift") pod "swift-storage-0" (UID: "d6c105dc-48df-4d07-8a82-ca93e78ad98c") : configmap "swift-ring-files" not found Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.646784 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jqhqt"] Oct 03 08:57:33 crc kubenswrapper[4790]: E1003 08:57:33.647141 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5099e4b-6707-4f2c-9d6f-504c101cf5c4" containerName="mariadb-database-create" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.647163 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5099e4b-6707-4f2c-9d6f-504c101cf5c4" containerName="mariadb-database-create" Oct 03 08:57:33 crc kubenswrapper[4790]: E1003 08:57:33.647181 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" containerName="dnsmasq-dns" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.647190 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" containerName="dnsmasq-dns" Oct 03 08:57:33 crc kubenswrapper[4790]: E1003 08:57:33.647199 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03453a08-49f9-4433-ba29-bd7f5ce716ff" containerName="mariadb-database-create" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.647207 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="03453a08-49f9-4433-ba29-bd7f5ce716ff" containerName="mariadb-database-create" Oct 03 08:57:33 crc kubenswrapper[4790]: E1003 08:57:33.647220 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" containerName="init" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.647227 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" containerName="init" 
Oct 03 08:57:33 crc kubenswrapper[4790]: E1003 08:57:33.647246 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba913c08-cae4-40c2-9ec8-a4865beb0918" containerName="mariadb-database-create" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.647253 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba913c08-cae4-40c2-9ec8-a4865beb0918" containerName="mariadb-database-create" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.647449 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5099e4b-6707-4f2c-9d6f-504c101cf5c4" containerName="mariadb-database-create" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.647471 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb8c0c5-d92e-462c-88ea-5bbfc5b06d90" containerName="dnsmasq-dns" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.647479 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba913c08-cae4-40c2-9ec8-a4865beb0918" containerName="mariadb-database-create" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.647487 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="03453a08-49f9-4433-ba29-bd7f5ce716ff" containerName="mariadb-database-create" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.648284 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jqhqt" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.654350 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jqhqt"] Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.765435 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvld4\" (UniqueName: \"kubernetes.io/projected/f0dccb48-67d6-443b-8572-ff55454618ef-kube-api-access-lvld4\") pod \"keystone-db-create-jqhqt\" (UID: \"f0dccb48-67d6-443b-8572-ff55454618ef\") " pod="openstack/keystone-db-create-jqhqt" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.867745 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvld4\" (UniqueName: \"kubernetes.io/projected/f0dccb48-67d6-443b-8572-ff55454618ef-kube-api-access-lvld4\") pod \"keystone-db-create-jqhqt\" (UID: \"f0dccb48-67d6-443b-8572-ff55454618ef\") " pod="openstack/keystone-db-create-jqhqt" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.896680 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvld4\" (UniqueName: \"kubernetes.io/projected/f0dccb48-67d6-443b-8572-ff55454618ef-kube-api-access-lvld4\") pod \"keystone-db-create-jqhqt\" (UID: \"f0dccb48-67d6-443b-8572-ff55454618ef\") " pod="openstack/keystone-db-create-jqhqt" Oct 03 08:57:33 crc kubenswrapper[4790]: I1003 08:57:33.974555 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jqhqt" Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.265077 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3dbc-account-create-qgx9w"] Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.267178 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3dbc-account-create-qgx9w" Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.271460 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.275503 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3dbc-account-create-qgx9w"] Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.378941 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqtsw\" (UniqueName: \"kubernetes.io/projected/ec270558-02cf-4d29-932d-b92a185a0396-kube-api-access-jqtsw\") pod \"glance-3dbc-account-create-qgx9w\" (UID: \"ec270558-02cf-4d29-932d-b92a185a0396\") " pod="openstack/glance-3dbc-account-create-qgx9w" Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.466031 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jqhqt"] Oct 03 08:57:34 crc kubenswrapper[4790]: W1003 08:57:34.469856 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0dccb48_67d6_443b_8572_ff55454618ef.slice/crio-710ed624045b92ddd769fd7b3ea467b1d8ac665887733644e64fdf75a4bdde12 WatchSource:0}: Error finding container 710ed624045b92ddd769fd7b3ea467b1d8ac665887733644e64fdf75a4bdde12: Status 404 returned error can't find the container with id 710ed624045b92ddd769fd7b3ea467b1d8ac665887733644e64fdf75a4bdde12 Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.480526 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqtsw\" (UniqueName: \"kubernetes.io/projected/ec270558-02cf-4d29-932d-b92a185a0396-kube-api-access-jqtsw\") pod \"glance-3dbc-account-create-qgx9w\" (UID: \"ec270558-02cf-4d29-932d-b92a185a0396\") " pod="openstack/glance-3dbc-account-create-qgx9w" Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 
08:57:34.508630 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqtsw\" (UniqueName: \"kubernetes.io/projected/ec270558-02cf-4d29-932d-b92a185a0396-kube-api-access-jqtsw\") pod \"glance-3dbc-account-create-qgx9w\" (UID: \"ec270558-02cf-4d29-932d-b92a185a0396\") " pod="openstack/glance-3dbc-account-create-qgx9w" Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.590608 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3dbc-account-create-qgx9w" Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.821165 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jqhqt" event={"ID":"f0dccb48-67d6-443b-8572-ff55454618ef","Type":"ContainerStarted","Data":"856ea50cea4f73c4347d3d07896076a26addfb96d480bd72a2e628c27e5ed309"} Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.821196 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jqhqt" event={"ID":"f0dccb48-67d6-443b-8572-ff55454618ef","Type":"ContainerStarted","Data":"710ed624045b92ddd769fd7b3ea467b1d8ac665887733644e64fdf75a4bdde12"} Oct 03 08:57:34 crc kubenswrapper[4790]: I1003 08:57:34.840680 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-jqhqt" podStartSLOduration=1.8406642560000002 podStartE2EDuration="1.840664256s" podCreationTimestamp="2025-10-03 08:57:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:34.837279733 +0000 UTC m=+1041.190213961" watchObservedRunningTime="2025-10-03 08:57:34.840664256 +0000 UTC m=+1041.193598494" Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.098915 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3dbc-account-create-qgx9w"] Oct 03 08:57:35 crc kubenswrapper[4790]: W1003 08:57:35.103731 4790 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec270558_02cf_4d29_932d_b92a185a0396.slice/crio-ec59d0fdf240a9ddfd64691b7caf36af34cdadc83c11cd579a7f23c4f6c78bd4 WatchSource:0}: Error finding container ec59d0fdf240a9ddfd64691b7caf36af34cdadc83c11cd579a7f23c4f6c78bd4: Status 404 returned error can't find the container with id ec59d0fdf240a9ddfd64691b7caf36af34cdadc83c11cd579a7f23c4f6c78bd4 Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.829894 4790 generic.go:334] "Generic (PLEG): container finished" podID="4f9b13ce-6ec8-4e05-b395-c7ec580823ed" containerID="d29ac79fa7d0ddf1e73030b7d05bc87f015a103898a4ebfba90722cb2320679b" exitCode=0 Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.829984 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5tpvp" event={"ID":"4f9b13ce-6ec8-4e05-b395-c7ec580823ed","Type":"ContainerDied","Data":"d29ac79fa7d0ddf1e73030b7d05bc87f015a103898a4ebfba90722cb2320679b"} Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.832279 4790 generic.go:334] "Generic (PLEG): container finished" podID="ec270558-02cf-4d29-932d-b92a185a0396" containerID="6c1c56bdc2139c51486d7bee72ed848dde56d34a71b6450dbdfc11afe4625775" exitCode=0 Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.832354 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3dbc-account-create-qgx9w" event={"ID":"ec270558-02cf-4d29-932d-b92a185a0396","Type":"ContainerDied","Data":"6c1c56bdc2139c51486d7bee72ed848dde56d34a71b6450dbdfc11afe4625775"} Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.832379 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3dbc-account-create-qgx9w" event={"ID":"ec270558-02cf-4d29-932d-b92a185a0396","Type":"ContainerStarted","Data":"ec59d0fdf240a9ddfd64691b7caf36af34cdadc83c11cd579a7f23c4f6c78bd4"} Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.834816 4790 generic.go:334] "Generic 
(PLEG): container finished" podID="f0dccb48-67d6-443b-8572-ff55454618ef" containerID="856ea50cea4f73c4347d3d07896076a26addfb96d480bd72a2e628c27e5ed309" exitCode=0 Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.834844 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jqhqt" event={"ID":"f0dccb48-67d6-443b-8572-ff55454618ef","Type":"ContainerDied","Data":"856ea50cea4f73c4347d3d07896076a26addfb96d480bd72a2e628c27e5ed309"} Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.997966 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.998260 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="prometheus" containerID="cri-o://3d67417773adb7ca519c4b78e60ec103659dc2679f5b2b7d1c5377b7affff09e" gracePeriod=600 Oct 03 08:57:35 crc kubenswrapper[4790]: I1003 08:57:35.998455 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="config-reloader" containerID="cri-o://e8d3fa0b987f5e1092abe139ea488243df2e034303948d70d6e0e2f553df6929" gracePeriod=600 Oct 03 08:57:36 crc kubenswrapper[4790]: I1003 08:57:35.998390 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="thanos-sidecar" containerID="cri-o://66efb58d24c825aea5e1279e2cf9068a06ecd1fcb3114cda1bd5d487cf794226" gracePeriod=600 Oct 03 08:57:36 crc kubenswrapper[4790]: I1003 08:57:36.846412 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerID="66efb58d24c825aea5e1279e2cf9068a06ecd1fcb3114cda1bd5d487cf794226" exitCode=0 Oct 03 08:57:36 crc 
kubenswrapper[4790]: I1003 08:57:36.846705 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerID="e8d3fa0b987f5e1092abe139ea488243df2e034303948d70d6e0e2f553df6929" exitCode=0 Oct 03 08:57:36 crc kubenswrapper[4790]: I1003 08:57:36.846714 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerID="3d67417773adb7ca519c4b78e60ec103659dc2679f5b2b7d1c5377b7affff09e" exitCode=0 Oct 03 08:57:36 crc kubenswrapper[4790]: I1003 08:57:36.846554 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3dc14abb-14bb-4bee-9864-8507e37d824d","Type":"ContainerDied","Data":"66efb58d24c825aea5e1279e2cf9068a06ecd1fcb3114cda1bd5d487cf794226"} Oct 03 08:57:36 crc kubenswrapper[4790]: I1003 08:57:36.846774 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3dc14abb-14bb-4bee-9864-8507e37d824d","Type":"ContainerDied","Data":"e8d3fa0b987f5e1092abe139ea488243df2e034303948d70d6e0e2f553df6929"} Oct 03 08:57:36 crc kubenswrapper[4790]: I1003 08:57:36.846788 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3dc14abb-14bb-4bee-9864-8507e37d824d","Type":"ContainerDied","Data":"3d67417773adb7ca519c4b78e60ec103659dc2679f5b2b7d1c5377b7affff09e"} Oct 03 08:57:36 crc kubenswrapper[4790]: I1003 08:57:36.847928 4790 generic.go:334] "Generic (PLEG): container finished" podID="f5ab6b2b-5bba-4043-b116-b8d996df1ace" containerID="f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323" exitCode=0 Oct 03 08:57:36 crc kubenswrapper[4790]: I1003 08:57:36.848117 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f5ab6b2b-5bba-4043-b116-b8d996df1ace","Type":"ContainerDied","Data":"f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323"} Oct 03 
08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.055963 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.131940 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dc14abb-14bb-4bee-9864-8507e37d824d-config-out\") pod \"3dc14abb-14bb-4bee-9864-8507e37d824d\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.132400 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-tls-assets\") pod \"3dc14abb-14bb-4bee-9864-8507e37d824d\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.132432 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-web-config\") pod \"3dc14abb-14bb-4bee-9864-8507e37d824d\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.132457 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-config\") pod \"3dc14abb-14bb-4bee-9864-8507e37d824d\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.132515 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78pfp\" (UniqueName: \"kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-kube-api-access-78pfp\") pod \"3dc14abb-14bb-4bee-9864-8507e37d824d\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.132678 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"3dc14abb-14bb-4bee-9864-8507e37d824d\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.132708 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-thanos-prometheus-http-client-file\") pod \"3dc14abb-14bb-4bee-9864-8507e37d824d\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.132746 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dc14abb-14bb-4bee-9864-8507e37d824d-prometheus-metric-storage-rulefiles-0\") pod \"3dc14abb-14bb-4bee-9864-8507e37d824d\" (UID: \"3dc14abb-14bb-4bee-9864-8507e37d824d\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.133632 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc14abb-14bb-4bee-9864-8507e37d824d-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "3dc14abb-14bb-4bee-9864-8507e37d824d" (UID: "3dc14abb-14bb-4bee-9864-8507e37d824d"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.144346 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-config" (OuterVolumeSpecName: "config") pod "3dc14abb-14bb-4bee-9864-8507e37d824d" (UID: "3dc14abb-14bb-4bee-9864-8507e37d824d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.145381 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-kube-api-access-78pfp" (OuterVolumeSpecName: "kube-api-access-78pfp") pod "3dc14abb-14bb-4bee-9864-8507e37d824d" (UID: "3dc14abb-14bb-4bee-9864-8507e37d824d"). InnerVolumeSpecName "kube-api-access-78pfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.160178 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "3dc14abb-14bb-4bee-9864-8507e37d824d" (UID: "3dc14abb-14bb-4bee-9864-8507e37d824d"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.161876 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dc14abb-14bb-4bee-9864-8507e37d824d-config-out" (OuterVolumeSpecName: "config-out") pod "3dc14abb-14bb-4bee-9864-8507e37d824d" (UID: "3dc14abb-14bb-4bee-9864-8507e37d824d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.162048 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "3dc14abb-14bb-4bee-9864-8507e37d824d" (UID: "3dc14abb-14bb-4bee-9864-8507e37d824d"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.183732 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-web-config" (OuterVolumeSpecName: "web-config") pod "3dc14abb-14bb-4bee-9864-8507e37d824d" (UID: "3dc14abb-14bb-4bee-9864-8507e37d824d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.198425 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "3dc14abb-14bb-4bee-9864-8507e37d824d" (UID: "3dc14abb-14bb-4bee-9864-8507e37d824d"). InnerVolumeSpecName "pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.234984 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78pfp\" (UniqueName: \"kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-kube-api-access-78pfp\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.235050 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") on node \"crc\" " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.235068 4790 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.235081 4790 reconciler_common.go:293] "Volume detached for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dc14abb-14bb-4bee-9864-8507e37d824d-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.235093 4790 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dc14abb-14bb-4bee-9864-8507e37d824d-config-out\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.235106 4790 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dc14abb-14bb-4bee-9864-8507e37d824d-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.235118 4790 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-web-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.235131 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dc14abb-14bb-4bee-9864-8507e37d824d-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.270127 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jqhqt" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.289108 4790 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.289283 4790 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4") on node "crc" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.336762 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvld4\" (UniqueName: \"kubernetes.io/projected/f0dccb48-67d6-443b-8572-ff55454618ef-kube-api-access-lvld4\") pod \"f0dccb48-67d6-443b-8572-ff55454618ef\" (UID: \"f0dccb48-67d6-443b-8572-ff55454618ef\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.337163 4790 reconciler_common.go:293] "Volume detached for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.355698 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0dccb48-67d6-443b-8572-ff55454618ef-kube-api-access-lvld4" (OuterVolumeSpecName: "kube-api-access-lvld4") pod "f0dccb48-67d6-443b-8572-ff55454618ef" (UID: "f0dccb48-67d6-443b-8572-ff55454618ef"). InnerVolumeSpecName "kube-api-access-lvld4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.438517 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvld4\" (UniqueName: \"kubernetes.io/projected/f0dccb48-67d6-443b-8572-ff55454618ef-kube-api-access-lvld4\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.443170 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.452678 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3dbc-account-create-qgx9w" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.539496 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-ring-data-devices\") pod \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.539615 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-combined-ca-bundle\") pod \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.539661 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-dispersionconf\") pod \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.539692 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26qt4\" (UniqueName: \"kubernetes.io/projected/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-kube-api-access-26qt4\") pod \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.539717 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqtsw\" (UniqueName: \"kubernetes.io/projected/ec270558-02cf-4d29-932d-b92a185a0396-kube-api-access-jqtsw\") pod 
\"ec270558-02cf-4d29-932d-b92a185a0396\" (UID: \"ec270558-02cf-4d29-932d-b92a185a0396\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.539752 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-scripts\") pod \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.539790 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-etc-swift\") pod \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.539866 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-swiftconf\") pod \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\" (UID: \"4f9b13ce-6ec8-4e05-b395-c7ec580823ed\") " Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.540301 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4f9b13ce-6ec8-4e05-b395-c7ec580823ed" (UID: "4f9b13ce-6ec8-4e05-b395-c7ec580823ed"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.544624 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4f9b13ce-6ec8-4e05-b395-c7ec580823ed" (UID: "4f9b13ce-6ec8-4e05-b395-c7ec580823ed"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.545157 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-kube-api-access-26qt4" (OuterVolumeSpecName: "kube-api-access-26qt4") pod "4f9b13ce-6ec8-4e05-b395-c7ec580823ed" (UID: "4f9b13ce-6ec8-4e05-b395-c7ec580823ed"). InnerVolumeSpecName "kube-api-access-26qt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.545523 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec270558-02cf-4d29-932d-b92a185a0396-kube-api-access-jqtsw" (OuterVolumeSpecName: "kube-api-access-jqtsw") pod "ec270558-02cf-4d29-932d-b92a185a0396" (UID: "ec270558-02cf-4d29-932d-b92a185a0396"). InnerVolumeSpecName "kube-api-access-jqtsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.557644 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4f9b13ce-6ec8-4e05-b395-c7ec580823ed" (UID: "4f9b13ce-6ec8-4e05-b395-c7ec580823ed"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.567485 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-scripts" (OuterVolumeSpecName: "scripts") pod "4f9b13ce-6ec8-4e05-b395-c7ec580823ed" (UID: "4f9b13ce-6ec8-4e05-b395-c7ec580823ed"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.573309 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f9b13ce-6ec8-4e05-b395-c7ec580823ed" (UID: "4f9b13ce-6ec8-4e05-b395-c7ec580823ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.577720 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4f9b13ce-6ec8-4e05-b395-c7ec580823ed" (UID: "4f9b13ce-6ec8-4e05-b395-c7ec580823ed"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.642155 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.642198 4790 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.642208 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26qt4\" (UniqueName: \"kubernetes.io/projected/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-kube-api-access-26qt4\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.642221 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqtsw\" (UniqueName: \"kubernetes.io/projected/ec270558-02cf-4d29-932d-b92a185a0396-kube-api-access-jqtsw\") on 
node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.642232 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.642240 4790 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.642250 4790 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.642258 4790 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f9b13ce-6ec8-4e05-b395-c7ec580823ed-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.865454 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3dbc-account-create-qgx9w" event={"ID":"ec270558-02cf-4d29-932d-b92a185a0396","Type":"ContainerDied","Data":"ec59d0fdf240a9ddfd64691b7caf36af34cdadc83c11cd579a7f23c4f6c78bd4"} Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.866559 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec59d0fdf240a9ddfd64691b7caf36af34cdadc83c11cd579a7f23c4f6c78bd4" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.865534 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3dbc-account-create-qgx9w" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.870068 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f5ab6b2b-5bba-4043-b116-b8d996df1ace","Type":"ContainerStarted","Data":"d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca"} Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.870417 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.874842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"3dc14abb-14bb-4bee-9864-8507e37d824d","Type":"ContainerDied","Data":"d348ec39b29be4bfa56af1730b69c2d7e9b6cfda886f84630fbe86bf5dce8091"} Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.874898 4790 scope.go:117] "RemoveContainer" containerID="66efb58d24c825aea5e1279e2cf9068a06ecd1fcb3114cda1bd5d487cf794226" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.875027 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.881558 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jqhqt" event={"ID":"f0dccb48-67d6-443b-8572-ff55454618ef","Type":"ContainerDied","Data":"710ed624045b92ddd769fd7b3ea467b1d8ac665887733644e64fdf75a4bdde12"} Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.881701 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="710ed624045b92ddd769fd7b3ea467b1d8ac665887733644e64fdf75a4bdde12" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.881791 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-jqhqt" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.889486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-5tpvp" event={"ID":"4f9b13ce-6ec8-4e05-b395-c7ec580823ed","Type":"ContainerDied","Data":"4a45eae668d0c92e71c27b5f2930e067bc30576f47b15890282712a45fd39498"} Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.889544 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a45eae668d0c92e71c27b5f2930e067bc30576f47b15890282712a45fd39498" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.889671 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-5tpvp" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.922476 4790 scope.go:117] "RemoveContainer" containerID="e8d3fa0b987f5e1092abe139ea488243df2e034303948d70d6e0e2f553df6929" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.938965 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.758561762 podStartE2EDuration="59.938945589s" podCreationTimestamp="2025-10-03 08:56:38 +0000 UTC" firstStartedPulling="2025-10-03 08:56:53.022136576 +0000 UTC m=+999.375070804" lastFinishedPulling="2025-10-03 08:57:02.202520393 +0000 UTC m=+1008.555454631" observedRunningTime="2025-10-03 08:57:37.899951763 +0000 UTC m=+1044.252886011" watchObservedRunningTime="2025-10-03 08:57:37.938945589 +0000 UTC m=+1044.291879827" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.945759 4790 scope.go:117] "RemoveContainer" containerID="3d67417773adb7ca519c4b78e60ec103659dc2679f5b2b7d1c5377b7affff09e" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.956195 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.969737 4790 
scope.go:117] "RemoveContainer" containerID="b560c625c12679a3e883f9561a4c8ed6a1a1cabb6d9496291e354a5429835a90" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.972954 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977339 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 08:57:37 crc kubenswrapper[4790]: E1003 08:57:37.977664 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f9b13ce-6ec8-4e05-b395-c7ec580823ed" containerName="swift-ring-rebalance" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977680 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9b13ce-6ec8-4e05-b395-c7ec580823ed" containerName="swift-ring-rebalance" Oct 03 08:57:37 crc kubenswrapper[4790]: E1003 08:57:37.977694 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="config-reloader" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977701 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="config-reloader" Oct 03 08:57:37 crc kubenswrapper[4790]: E1003 08:57:37.977717 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec270558-02cf-4d29-932d-b92a185a0396" containerName="mariadb-account-create" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977724 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec270558-02cf-4d29-932d-b92a185a0396" containerName="mariadb-account-create" Oct 03 08:57:37 crc kubenswrapper[4790]: E1003 08:57:37.977734 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="prometheus" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977739 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="prometheus" Oct 03 08:57:37 crc kubenswrapper[4790]: E1003 08:57:37.977748 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="init-config-reloader" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977753 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="init-config-reloader" Oct 03 08:57:37 crc kubenswrapper[4790]: E1003 08:57:37.977770 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="thanos-sidecar" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977776 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="thanos-sidecar" Oct 03 08:57:37 crc kubenswrapper[4790]: E1003 08:57:37.977788 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0dccb48-67d6-443b-8572-ff55454618ef" containerName="mariadb-database-create" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977793 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0dccb48-67d6-443b-8572-ff55454618ef" containerName="mariadb-database-create" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977959 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="thanos-sidecar" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977975 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0dccb48-67d6-443b-8572-ff55454618ef" containerName="mariadb-database-create" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977985 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="config-reloader" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.977994 4790 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" containerName="prometheus" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.978001 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec270558-02cf-4d29-932d-b92a185a0396" containerName="mariadb-account-create" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.978010 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f9b13ce-6ec8-4e05-b395-c7ec580823ed" containerName="swift-ring-rebalance" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.983358 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.988310 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.988518 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.988640 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.988769 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xvhkt" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.988887 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.988999 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 03 08:57:37 crc kubenswrapper[4790]: I1003 08:57:37.995045 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 08:57:38 crc 
kubenswrapper[4790]: I1003 08:57:38.008703 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.047919 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/969ea996-248e-41c5-a868-bd2321bfdf49-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.048004 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.048056 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/969ea996-248e-41c5-a868-bd2321bfdf49-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.048343 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g55gr\" (UniqueName: \"kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-kube-api-access-g55gr\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.048397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.048417 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.048450 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-config\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.048473 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.048522 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " 
pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.048553 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.048598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150176 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/969ea996-248e-41c5-a868-bd2321bfdf49-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150273 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150304 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/969ea996-248e-41c5-a868-bd2321bfdf49-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150338 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g55gr\" (UniqueName: \"kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-kube-api-access-g55gr\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150370 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150388 
4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-config\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150440 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.150482 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.151858 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/969ea996-248e-41c5-a868-bd2321bfdf49-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: 
I1003 08:57:38.155934 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.155947 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.156054 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a9f968d81b8706f372416bcf6becb6316da12568f5c5e3584a75f6147541fb3e/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.156565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.157318 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-config\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: 
I1003 08:57:38.158693 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.160220 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/969ea996-248e-41c5-a868-bd2321bfdf49-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.160585 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.162479 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.167437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 
08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.187121 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g55gr\" (UniqueName: \"kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-kube-api-access-g55gr\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.265125 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.316284 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.341410 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc14abb-14bb-4bee-9864-8507e37d824d" path="/var/lib/kubelet/pods/3dc14abb-14bb-4bee-9864-8507e37d824d/volumes" Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.899166 4790 generic.go:334] "Generic (PLEG): container finished" podID="f79d57e8-680a-41b6-bca9-c233a695be4e" containerID="967ede7f4d897538e40daed769bfd55730474c6700fa08ab1a0dcd1960359c86" exitCode=0 Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.899238 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f79d57e8-680a-41b6-bca9-c233a695be4e","Type":"ContainerDied","Data":"967ede7f4d897538e40daed769bfd55730474c6700fa08ab1a0dcd1960359c86"} Oct 03 08:57:38 crc kubenswrapper[4790]: I1003 08:57:38.954286 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 08:57:39 crc 
kubenswrapper[4790]: I1003 08:57:39.499085 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-m92cz"] Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.500696 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.502828 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.514443 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m92cz"] Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.534029 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lvfmv" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.576312 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-config-data\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.576367 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-db-sync-config-data\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.576552 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-combined-ca-bundle\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 
crc kubenswrapper[4790]: I1003 08:57:39.576642 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxrhn\" (UniqueName: \"kubernetes.io/projected/2859de83-046e-475b-bb37-fb8f4a3de695-kube-api-access-lxrhn\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.678376 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-config-data\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.678438 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-db-sync-config-data\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.678562 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-combined-ca-bundle\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.678627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxrhn\" (UniqueName: \"kubernetes.io/projected/2859de83-046e-475b-bb37-fb8f4a3de695-kube-api-access-lxrhn\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.683235 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-db-sync-config-data\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.684290 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-combined-ca-bundle\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.684775 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-config-data\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.724393 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxrhn\" (UniqueName: \"kubernetes.io/projected/2859de83-046e-475b-bb37-fb8f4a3de695-kube-api-access-lxrhn\") pod \"glance-db-sync-m92cz\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.835653 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m92cz" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.915664 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"969ea996-248e-41c5-a868-bd2321bfdf49","Type":"ContainerStarted","Data":"6aba2847061e4fa530565b96e4fe6bc6987c8e70fff3a26d62d3735a6b628e86"} Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.920127 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f79d57e8-680a-41b6-bca9-c233a695be4e","Type":"ContainerStarted","Data":"e71013313068ea9ba19dfe304d8f5cc8e067a087124a03955964c966041a4c30"} Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.920752 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 08:57:39 crc kubenswrapper[4790]: I1003 08:57:39.943418 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.040483276 podStartE2EDuration="1m1.943400107s" podCreationTimestamp="2025-10-03 08:56:38 +0000 UTC" firstStartedPulling="2025-10-03 08:56:54.044412673 +0000 UTC m=+1000.397346911" lastFinishedPulling="2025-10-03 08:57:02.947329504 +0000 UTC m=+1009.300263742" observedRunningTime="2025-10-03 08:57:39.942521993 +0000 UTC m=+1046.295456241" watchObservedRunningTime="2025-10-03 08:57:39.943400107 +0000 UTC m=+1046.296334345" Oct 03 08:57:40 crc kubenswrapper[4790]: I1003 08:57:40.441534 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m92cz"] Oct 03 08:57:40 crc kubenswrapper[4790]: W1003 08:57:40.455911 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2859de83_046e_475b_bb37_fb8f4a3de695.slice/crio-e5c7a0b846a2faad1bc369aaebdd77745c853b7ea509357ef22f83588a2a6923 WatchSource:0}: Error finding container 
e5c7a0b846a2faad1bc369aaebdd77745c853b7ea509357ef22f83588a2a6923: Status 404 returned error can't find the container with id e5c7a0b846a2faad1bc369aaebdd77745c853b7ea509357ef22f83588a2a6923 Oct 03 08:57:40 crc kubenswrapper[4790]: I1003 08:57:40.928648 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m92cz" event={"ID":"2859de83-046e-475b-bb37-fb8f4a3de695","Type":"ContainerStarted","Data":"e5c7a0b846a2faad1bc369aaebdd77745c853b7ea509357ef22f83588a2a6923"} Oct 03 08:57:42 crc kubenswrapper[4790]: I1003 08:57:42.948699 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"969ea996-248e-41c5-a868-bd2321bfdf49","Type":"ContainerStarted","Data":"1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623"} Oct 03 08:57:43 crc kubenswrapper[4790]: I1003 08:57:43.730085 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7a52-account-create-bzmtx"] Oct 03 08:57:43 crc kubenswrapper[4790]: I1003 08:57:43.731388 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7a52-account-create-bzmtx" Oct 03 08:57:43 crc kubenswrapper[4790]: I1003 08:57:43.734190 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 03 08:57:43 crc kubenswrapper[4790]: I1003 08:57:43.742164 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7a52-account-create-bzmtx"] Oct 03 08:57:43 crc kubenswrapper[4790]: I1003 08:57:43.856725 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgltk\" (UniqueName: \"kubernetes.io/projected/5da9e362-3ba2-4a58-a4e1-06c4ff646cca-kube-api-access-lgltk\") pod \"keystone-7a52-account-create-bzmtx\" (UID: \"5da9e362-3ba2-4a58-a4e1-06c4ff646cca\") " pod="openstack/keystone-7a52-account-create-bzmtx" Oct 03 08:57:43 crc kubenswrapper[4790]: I1003 08:57:43.958715 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgltk\" (UniqueName: \"kubernetes.io/projected/5da9e362-3ba2-4a58-a4e1-06c4ff646cca-kube-api-access-lgltk\") pod \"keystone-7a52-account-create-bzmtx\" (UID: \"5da9e362-3ba2-4a58-a4e1-06c4ff646cca\") " pod="openstack/keystone-7a52-account-create-bzmtx" Oct 03 08:57:43 crc kubenswrapper[4790]: I1003 08:57:43.961694 4790 generic.go:334] "Generic (PLEG): container finished" podID="04926fe2-c5e2-4663-8945-448a17ec2a8b" containerID="d800fd3bcb397d0650138d460f6c0f701f0013b82b7baf3fc84e85e9723387fb" exitCode=0 Oct 03 08:57:43 crc kubenswrapper[4790]: I1003 08:57:43.961772 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"04926fe2-c5e2-4663-8945-448a17ec2a8b","Type":"ContainerDied","Data":"d800fd3bcb397d0650138d460f6c0f701f0013b82b7baf3fc84e85e9723387fb"} Oct 03 08:57:43 crc kubenswrapper[4790]: I1003 08:57:43.978913 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgltk\" 
(UniqueName: \"kubernetes.io/projected/5da9e362-3ba2-4a58-a4e1-06c4ff646cca-kube-api-access-lgltk\") pod \"keystone-7a52-account-create-bzmtx\" (UID: \"5da9e362-3ba2-4a58-a4e1-06c4ff646cca\") " pod="openstack/keystone-7a52-account-create-bzmtx" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.032870 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b0cf-account-create-j9m59"] Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.037294 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b0cf-account-create-j9m59" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.041218 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.051604 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b0cf-account-create-j9m59"] Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.061279 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7a52-account-create-bzmtx" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.165974 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jgvp\" (UniqueName: \"kubernetes.io/projected/a04e5a9b-9d29-41ec-8ca6-3106d6d431ac-kube-api-access-2jgvp\") pod \"placement-b0cf-account-create-j9m59\" (UID: \"a04e5a9b-9d29-41ec-8ca6-3106d6d431ac\") " pod="openstack/placement-b0cf-account-create-j9m59" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.269533 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jgvp\" (UniqueName: \"kubernetes.io/projected/a04e5a9b-9d29-41ec-8ca6-3106d6d431ac-kube-api-access-2jgvp\") pod \"placement-b0cf-account-create-j9m59\" (UID: \"a04e5a9b-9d29-41ec-8ca6-3106d6d431ac\") " pod="openstack/placement-b0cf-account-create-j9m59" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.290941 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jgvp\" (UniqueName: \"kubernetes.io/projected/a04e5a9b-9d29-41ec-8ca6-3106d6d431ac-kube-api-access-2jgvp\") pod \"placement-b0cf-account-create-j9m59\" (UID: \"a04e5a9b-9d29-41ec-8ca6-3106d6d431ac\") " pod="openstack/placement-b0cf-account-create-j9m59" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.456087 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b0cf-account-create-j9m59" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.614882 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-8ljvm" podUID="845dcf0a-43fb-4760-aac6-d12c02a8b785" containerName="ovn-controller" probeResult="failure" output=< Oct 03 08:57:44 crc kubenswrapper[4790]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 03 08:57:44 crc kubenswrapper[4790]: > Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.629932 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7a52-account-create-bzmtx"] Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.650892 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.651080 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vrsc7" Oct 03 08:57:44 crc kubenswrapper[4790]: W1003 08:57:44.652165 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5da9e362_3ba2_4a58_a4e1_06c4ff646cca.slice/crio-526794ec60e59f4c11322349dc946ed96f4ca1b04cc2bc8df538a73a52d8df91 WatchSource:0}: Error finding container 526794ec60e59f4c11322349dc946ed96f4ca1b04cc2bc8df538a73a52d8df91: Status 404 returned error can't find the container with id 526794ec60e59f4c11322349dc946ed96f4ca1b04cc2bc8df538a73a52d8df91 Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.907846 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8ljvm-config-9clpb"] Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.909285 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.921410 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.931228 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8ljvm-config-9clpb"] Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.994109 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.994178 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-additional-scripts\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.994199 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-log-ovn\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.994233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c5z2\" (UniqueName: \"kubernetes.io/projected/0a072c71-36df-4f5f-a7e9-90989b0d64ae-kube-api-access-9c5z2\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: 
\"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.994266 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-scripts\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:44 crc kubenswrapper[4790]: I1003 08:57:44.994293 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run-ovn\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.003601 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7a52-account-create-bzmtx" event={"ID":"5da9e362-3ba2-4a58-a4e1-06c4ff646cca","Type":"ContainerStarted","Data":"e0c82b798526105642c0ff4fd307316f3a97bc290a9d45108d73157e07e5b386"} Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.003650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7a52-account-create-bzmtx" event={"ID":"5da9e362-3ba2-4a58-a4e1-06c4ff646cca","Type":"ContainerStarted","Data":"526794ec60e59f4c11322349dc946ed96f4ca1b04cc2bc8df538a73a52d8df91"} Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.020422 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"04926fe2-c5e2-4663-8945-448a17ec2a8b","Type":"ContainerStarted","Data":"c8568b2f1028e2b148d052b6fe71bf5918f84b3f2b34bfa30aee6f559c00150f"} Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.021009 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/rabbitmq-notifications-server-0" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.035684 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7a52-account-create-bzmtx" podStartSLOduration=2.035658831 podStartE2EDuration="2.035658831s" podCreationTimestamp="2025-10-03 08:57:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:45.0268458 +0000 UTC m=+1051.379780038" watchObservedRunningTime="2025-10-03 08:57:45.035658831 +0000 UTC m=+1051.388593069" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.095520 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-scripts\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.095591 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run-ovn\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.095694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.095704 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b0cf-account-create-j9m59"] Oct 03 08:57:45 crc 
kubenswrapper[4790]: I1003 08:57:45.095722 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-additional-scripts\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.095834 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-log-ovn\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.095922 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c5z2\" (UniqueName: \"kubernetes.io/projected/0a072c71-36df-4f5f-a7e9-90989b0d64ae-kube-api-access-9c5z2\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.096423 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run-ovn\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.096423 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-log-ovn\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc 
kubenswrapper[4790]: I1003 08:57:45.096490 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.096729 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-additional-scripts\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.098359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-scripts\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.139973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c5z2\" (UniqueName: \"kubernetes.io/projected/0a072c71-36df-4f5f-a7e9-90989b0d64ae-kube-api-access-9c5z2\") pod \"ovn-controller-8ljvm-config-9clpb\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.237604 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.730551 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=58.799577866999996 podStartE2EDuration="1m7.730524585s" podCreationTimestamp="2025-10-03 08:56:38 +0000 UTC" firstStartedPulling="2025-10-03 08:56:54.016476309 +0000 UTC m=+1000.369410547" lastFinishedPulling="2025-10-03 08:57:02.947423027 +0000 UTC m=+1009.300357265" observedRunningTime="2025-10-03 08:57:45.084813297 +0000 UTC m=+1051.437747535" watchObservedRunningTime="2025-10-03 08:57:45.730524585 +0000 UTC m=+1052.083458833" Oct 03 08:57:45 crc kubenswrapper[4790]: I1003 08:57:45.732121 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8ljvm-config-9clpb"] Oct 03 08:57:45 crc kubenswrapper[4790]: W1003 08:57:45.734437 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a072c71_36df_4f5f_a7e9_90989b0d64ae.slice/crio-0b84403f8dfe12a05c4c0409c2a503616b1f5f74055ca97b3bb3584e9fd92a70 WatchSource:0}: Error finding container 0b84403f8dfe12a05c4c0409c2a503616b1f5f74055ca97b3bb3584e9fd92a70: Status 404 returned error can't find the container with id 0b84403f8dfe12a05c4c0409c2a503616b1f5f74055ca97b3bb3584e9fd92a70 Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.042323 4790 generic.go:334] "Generic (PLEG): container finished" podID="a04e5a9b-9d29-41ec-8ca6-3106d6d431ac" containerID="8f7ce841158cd06344c6fffb05bd4f61802c4eea97cd88fea1ce751cfab4f994" exitCode=0 Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.042592 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b0cf-account-create-j9m59" event={"ID":"a04e5a9b-9d29-41ec-8ca6-3106d6d431ac","Type":"ContainerDied","Data":"8f7ce841158cd06344c6fffb05bd4f61802c4eea97cd88fea1ce751cfab4f994"} 
Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.042668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b0cf-account-create-j9m59" event={"ID":"a04e5a9b-9d29-41ec-8ca6-3106d6d431ac","Type":"ContainerStarted","Data":"130c1a053d575d3a3103722704bb18fa59436991931e688856ba58754cee43c4"} Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.045241 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8ljvm-config-9clpb" event={"ID":"0a072c71-36df-4f5f-a7e9-90989b0d64ae","Type":"ContainerStarted","Data":"0b84403f8dfe12a05c4c0409c2a503616b1f5f74055ca97b3bb3584e9fd92a70"} Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.047181 4790 generic.go:334] "Generic (PLEG): container finished" podID="5da9e362-3ba2-4a58-a4e1-06c4ff646cca" containerID="e0c82b798526105642c0ff4fd307316f3a97bc290a9d45108d73157e07e5b386" exitCode=0 Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.047983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7a52-account-create-bzmtx" event={"ID":"5da9e362-3ba2-4a58-a4e1-06c4ff646cca","Type":"ContainerDied","Data":"e0c82b798526105642c0ff4fd307316f3a97bc290a9d45108d73157e07e5b386"} Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.344621 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-5be0-account-create-clnmh"] Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.346400 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-5be0-account-create-clnmh" Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.356992 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.371347 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-5be0-account-create-clnmh"] Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.424295 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrtf\" (UniqueName: \"kubernetes.io/projected/6b7d6d53-4ca2-49d7-92bb-9144efa2c104-kube-api-access-thrtf\") pod \"watcher-5be0-account-create-clnmh\" (UID: \"6b7d6d53-4ca2-49d7-92bb-9144efa2c104\") " pod="openstack/watcher-5be0-account-create-clnmh" Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.526429 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrtf\" (UniqueName: \"kubernetes.io/projected/6b7d6d53-4ca2-49d7-92bb-9144efa2c104-kube-api-access-thrtf\") pod \"watcher-5be0-account-create-clnmh\" (UID: \"6b7d6d53-4ca2-49d7-92bb-9144efa2c104\") " pod="openstack/watcher-5be0-account-create-clnmh" Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.546893 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrtf\" (UniqueName: \"kubernetes.io/projected/6b7d6d53-4ca2-49d7-92bb-9144efa2c104-kube-api-access-thrtf\") pod \"watcher-5be0-account-create-clnmh\" (UID: \"6b7d6d53-4ca2-49d7-92bb-9144efa2c104\") " pod="openstack/watcher-5be0-account-create-clnmh" Oct 03 08:57:46 crc kubenswrapper[4790]: I1003 08:57:46.669495 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-5be0-account-create-clnmh" Oct 03 08:57:47 crc kubenswrapper[4790]: I1003 08:57:47.060389 4790 generic.go:334] "Generic (PLEG): container finished" podID="0a072c71-36df-4f5f-a7e9-90989b0d64ae" containerID="1b075cf75721e4b4dc222872f5825468c72b3460427c9967d29f7432a5c07169" exitCode=0 Oct 03 08:57:47 crc kubenswrapper[4790]: I1003 08:57:47.061304 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8ljvm-config-9clpb" event={"ID":"0a072c71-36df-4f5f-a7e9-90989b0d64ae","Type":"ContainerDied","Data":"1b075cf75721e4b4dc222872f5825468c72b3460427c9967d29f7432a5c07169"} Oct 03 08:57:47 crc kubenswrapper[4790]: I1003 08:57:47.148953 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-5be0-account-create-clnmh"] Oct 03 08:57:48 crc kubenswrapper[4790]: I1003 08:57:48.073809 4790 generic.go:334] "Generic (PLEG): container finished" podID="6b7d6d53-4ca2-49d7-92bb-9144efa2c104" containerID="b3d94db5ea9f3fcafbabcf709c49a597d4397710f4d76be3a6835ddc2a8db74d" exitCode=0 Oct 03 08:57:48 crc kubenswrapper[4790]: I1003 08:57:48.073907 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-5be0-account-create-clnmh" event={"ID":"6b7d6d53-4ca2-49d7-92bb-9144efa2c104","Type":"ContainerDied","Data":"b3d94db5ea9f3fcafbabcf709c49a597d4397710f4d76be3a6835ddc2a8db74d"} Oct 03 08:57:48 crc kubenswrapper[4790]: I1003 08:57:48.074468 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-5be0-account-create-clnmh" event={"ID":"6b7d6d53-4ca2-49d7-92bb-9144efa2c104","Type":"ContainerStarted","Data":"508076a539694b2be7f6db728fae34b7dfbd5b22db936ca84df0282dad80de0d"} Oct 03 08:57:49 crc kubenswrapper[4790]: I1003 08:57:49.170943 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift\") pod \"swift-storage-0\" (UID: 
\"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:49 crc kubenswrapper[4790]: I1003 08:57:49.186639 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6c105dc-48df-4d07-8a82-ca93e78ad98c-etc-swift\") pod \"swift-storage-0\" (UID: \"d6c105dc-48df-4d07-8a82-ca93e78ad98c\") " pod="openstack/swift-storage-0" Oct 03 08:57:49 crc kubenswrapper[4790]: I1003 08:57:49.193083 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 03 08:57:49 crc kubenswrapper[4790]: I1003 08:57:49.601508 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-8ljvm" Oct 03 08:57:49 crc kubenswrapper[4790]: I1003 08:57:49.695639 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="f79d57e8-680a-41b6-bca9-c233a695be4e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Oct 03 08:57:50 crc kubenswrapper[4790]: I1003 08:57:50.048933 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="f5ab6b2b-5bba-4043-b116-b8d996df1ace" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Oct 03 08:57:51 crc kubenswrapper[4790]: I1003 08:57:51.129091 4790 generic.go:334] "Generic (PLEG): container finished" podID="969ea996-248e-41c5-a868-bd2321bfdf49" containerID="1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623" exitCode=0 Oct 03 08:57:51 crc kubenswrapper[4790]: I1003 08:57:51.129140 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"969ea996-248e-41c5-a868-bd2321bfdf49","Type":"ContainerDied","Data":"1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623"} Oct 03 08:57:54 crc 
kubenswrapper[4790]: I1003 08:57:54.672721 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.749524 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-5be0-account-create-clnmh" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.754898 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7a52-account-create-bzmtx" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.760935 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b0cf-account-create-j9m59" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.781100 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-log-ovn\") pod \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.781488 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run-ovn\") pod \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.781253 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0a072c71-36df-4f5f-a7e9-90989b0d64ae" (UID: "0a072c71-36df-4f5f-a7e9-90989b0d64ae"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.781658 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0a072c71-36df-4f5f-a7e9-90989b0d64ae" (UID: "0a072c71-36df-4f5f-a7e9-90989b0d64ae"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.781563 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jgvp\" (UniqueName: \"kubernetes.io/projected/a04e5a9b-9d29-41ec-8ca6-3106d6d431ac-kube-api-access-2jgvp\") pod \"a04e5a9b-9d29-41ec-8ca6-3106d6d431ac\" (UID: \"a04e5a9b-9d29-41ec-8ca6-3106d6d431ac\") " Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.782292 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgltk\" (UniqueName: \"kubernetes.io/projected/5da9e362-3ba2-4a58-a4e1-06c4ff646cca-kube-api-access-lgltk\") pod \"5da9e362-3ba2-4a58-a4e1-06c4ff646cca\" (UID: \"5da9e362-3ba2-4a58-a4e1-06c4ff646cca\") " Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.782611 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run\") pod \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.782762 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c5z2\" (UniqueName: \"kubernetes.io/projected/0a072c71-36df-4f5f-a7e9-90989b0d64ae-kube-api-access-9c5z2\") pod \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.782826 
4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thrtf\" (UniqueName: \"kubernetes.io/projected/6b7d6d53-4ca2-49d7-92bb-9144efa2c104-kube-api-access-thrtf\") pod \"6b7d6d53-4ca2-49d7-92bb-9144efa2c104\" (UID: \"6b7d6d53-4ca2-49d7-92bb-9144efa2c104\") " Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.782852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-scripts\") pod \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.782923 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-additional-scripts\") pod \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\" (UID: \"0a072c71-36df-4f5f-a7e9-90989b0d64ae\") " Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.783440 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run" (OuterVolumeSpecName: "var-run") pod "0a072c71-36df-4f5f-a7e9-90989b0d64ae" (UID: "0a072c71-36df-4f5f-a7e9-90989b0d64ae"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.784178 4790 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.784208 4790 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.784241 4790 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0a072c71-36df-4f5f-a7e9-90989b0d64ae-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.784420 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0a072c71-36df-4f5f-a7e9-90989b0d64ae" (UID: "0a072c71-36df-4f5f-a7e9-90989b0d64ae"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.784784 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-scripts" (OuterVolumeSpecName: "scripts") pod "0a072c71-36df-4f5f-a7e9-90989b0d64ae" (UID: "0a072c71-36df-4f5f-a7e9-90989b0d64ae"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.792625 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a072c71-36df-4f5f-a7e9-90989b0d64ae-kube-api-access-9c5z2" (OuterVolumeSpecName: "kube-api-access-9c5z2") pod "0a072c71-36df-4f5f-a7e9-90989b0d64ae" (UID: "0a072c71-36df-4f5f-a7e9-90989b0d64ae"). InnerVolumeSpecName "kube-api-access-9c5z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.797909 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b7d6d53-4ca2-49d7-92bb-9144efa2c104-kube-api-access-thrtf" (OuterVolumeSpecName: "kube-api-access-thrtf") pod "6b7d6d53-4ca2-49d7-92bb-9144efa2c104" (UID: "6b7d6d53-4ca2-49d7-92bb-9144efa2c104"). InnerVolumeSpecName "kube-api-access-thrtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.800277 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a04e5a9b-9d29-41ec-8ca6-3106d6d431ac-kube-api-access-2jgvp" (OuterVolumeSpecName: "kube-api-access-2jgvp") pod "a04e5a9b-9d29-41ec-8ca6-3106d6d431ac" (UID: "a04e5a9b-9d29-41ec-8ca6-3106d6d431ac"). InnerVolumeSpecName "kube-api-access-2jgvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.802406 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da9e362-3ba2-4a58-a4e1-06c4ff646cca-kube-api-access-lgltk" (OuterVolumeSpecName: "kube-api-access-lgltk") pod "5da9e362-3ba2-4a58-a4e1-06c4ff646cca" (UID: "5da9e362-3ba2-4a58-a4e1-06c4ff646cca"). InnerVolumeSpecName "kube-api-access-lgltk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.885370 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jgvp\" (UniqueName: \"kubernetes.io/projected/a04e5a9b-9d29-41ec-8ca6-3106d6d431ac-kube-api-access-2jgvp\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.885408 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgltk\" (UniqueName: \"kubernetes.io/projected/5da9e362-3ba2-4a58-a4e1-06c4ff646cca-kube-api-access-lgltk\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.885422 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c5z2\" (UniqueName: \"kubernetes.io/projected/0a072c71-36df-4f5f-a7e9-90989b0d64ae-kube-api-access-9c5z2\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.885435 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.885447 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thrtf\" (UniqueName: \"kubernetes.io/projected/6b7d6d53-4ca2-49d7-92bb-9144efa2c104-kube-api-access-thrtf\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:54 crc kubenswrapper[4790]: I1003 08:57:54.885458 4790 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0a072c71-36df-4f5f-a7e9-90989b0d64ae-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.180041 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8ljvm-config-9clpb" 
event={"ID":"0a072c71-36df-4f5f-a7e9-90989b0d64ae","Type":"ContainerDied","Data":"0b84403f8dfe12a05c4c0409c2a503616b1f5f74055ca97b3bb3584e9fd92a70"} Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.180076 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b84403f8dfe12a05c4c0409c2a503616b1f5f74055ca97b3bb3584e9fd92a70" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.180151 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8ljvm-config-9clpb" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.185033 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"969ea996-248e-41c5-a868-bd2321bfdf49","Type":"ContainerStarted","Data":"b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41"} Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.196250 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7a52-account-create-bzmtx" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.196339 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7a52-account-create-bzmtx" event={"ID":"5da9e362-3ba2-4a58-a4e1-06c4ff646cca","Type":"ContainerDied","Data":"526794ec60e59f4c11322349dc946ed96f4ca1b04cc2bc8df538a73a52d8df91"} Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.196407 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="526794ec60e59f4c11322349dc946ed96f4ca1b04cc2bc8df538a73a52d8df91" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.198322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-5be0-account-create-clnmh" event={"ID":"6b7d6d53-4ca2-49d7-92bb-9144efa2c104","Type":"ContainerDied","Data":"508076a539694b2be7f6db728fae34b7dfbd5b22db936ca84df0282dad80de0d"} Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.198465 4790 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="508076a539694b2be7f6db728fae34b7dfbd5b22db936ca84df0282dad80de0d" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.198358 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-5be0-account-create-clnmh" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.202147 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b0cf-account-create-j9m59" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.202031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b0cf-account-create-j9m59" event={"ID":"a04e5a9b-9d29-41ec-8ca6-3106d6d431ac","Type":"ContainerDied","Data":"130c1a053d575d3a3103722704bb18fa59436991931e688856ba58754cee43c4"} Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.203816 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="130c1a053d575d3a3103722704bb18fa59436991931e688856ba58754cee43c4" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.208080 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 03 08:57:55 crc kubenswrapper[4790]: W1003 08:57:55.212736 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6c105dc_48df_4d07_8a82_ca93e78ad98c.slice/crio-8f904766b9043c137a33937cd2ffaeb13178ddcb9f492ff13fa0e1320cee1f91 WatchSource:0}: Error finding container 8f904766b9043c137a33937cd2ffaeb13178ddcb9f492ff13fa0e1320cee1f91: Status 404 returned error can't find the container with id 8f904766b9043c137a33937cd2ffaeb13178ddcb9f492ff13fa0e1320cee1f91 Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.787376 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8ljvm-config-9clpb"] Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 
08:57:55.797255 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8ljvm-config-9clpb"] Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.884973 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-8ljvm-config-cvgm4"] Oct 03 08:57:55 crc kubenswrapper[4790]: E1003 08:57:55.885613 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da9e362-3ba2-4a58-a4e1-06c4ff646cca" containerName="mariadb-account-create" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.885691 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da9e362-3ba2-4a58-a4e1-06c4ff646cca" containerName="mariadb-account-create" Oct 03 08:57:55 crc kubenswrapper[4790]: E1003 08:57:55.885772 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a04e5a9b-9d29-41ec-8ca6-3106d6d431ac" containerName="mariadb-account-create" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.885831 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04e5a9b-9d29-41ec-8ca6-3106d6d431ac" containerName="mariadb-account-create" Oct 03 08:57:55 crc kubenswrapper[4790]: E1003 08:57:55.885894 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a072c71-36df-4f5f-a7e9-90989b0d64ae" containerName="ovn-config" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.885952 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a072c71-36df-4f5f-a7e9-90989b0d64ae" containerName="ovn-config" Oct 03 08:57:55 crc kubenswrapper[4790]: E1003 08:57:55.886013 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b7d6d53-4ca2-49d7-92bb-9144efa2c104" containerName="mariadb-account-create" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.886072 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b7d6d53-4ca2-49d7-92bb-9144efa2c104" containerName="mariadb-account-create" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.886295 4790 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5da9e362-3ba2-4a58-a4e1-06c4ff646cca" containerName="mariadb-account-create" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.886416 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b7d6d53-4ca2-49d7-92bb-9144efa2c104" containerName="mariadb-account-create" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.886489 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a04e5a9b-9d29-41ec-8ca6-3106d6d431ac" containerName="mariadb-account-create" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.886597 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a072c71-36df-4f5f-a7e9-90989b0d64ae" containerName="ovn-config" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.887338 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.892240 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.899229 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8ljvm-config-cvgm4"] Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.911461 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run-ovn\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.911601 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-log-ovn\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: 
\"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.911663 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.911701 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-additional-scripts\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.911724 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-scripts\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:55 crc kubenswrapper[4790]: I1003 08:57:55.911759 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjhvc\" (UniqueName: \"kubernetes.io/projected/2d7ff74f-da99-474d-9044-630e6103544c-kube-api-access-jjhvc\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.013724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.013780 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-additional-scripts\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.013806 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-scripts\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.013847 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjhvc\" (UniqueName: \"kubernetes.io/projected/2d7ff74f-da99-474d-9044-630e6103544c-kube-api-access-jjhvc\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.013900 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run-ovn\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.013991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-log-ovn\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.014189 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.014243 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-log-ovn\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.014310 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run-ovn\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.015282 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-additional-scripts\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.016368 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-scripts\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.045742 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjhvc\" (UniqueName: \"kubernetes.io/projected/2d7ff74f-da99-474d-9044-630e6103544c-kube-api-access-jjhvc\") pod \"ovn-controller-8ljvm-config-cvgm4\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.209218 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.217588 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m92cz" event={"ID":"2859de83-046e-475b-bb37-fb8f4a3de695","Type":"ContainerStarted","Data":"29a7188a62adcae7fa702c6f2b865eeb573ec43ca9cb59ab1a3ab007c0a35f02"} Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.221006 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"8f904766b9043c137a33937cd2ffaeb13178ddcb9f492ff13fa0e1320cee1f91"} Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.232836 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-m92cz" podStartSLOduration=3.104610245 podStartE2EDuration="17.232817012s" podCreationTimestamp="2025-10-03 08:57:39 +0000 UTC" firstStartedPulling="2025-10-03 08:57:40.458833118 +0000 UTC m=+1046.811767356" lastFinishedPulling="2025-10-03 08:57:54.587039875 +0000 UTC m=+1060.939974123" observedRunningTime="2025-10-03 08:57:56.230656543 +0000 UTC m=+1062.583590791" watchObservedRunningTime="2025-10-03 
08:57:56.232817012 +0000 UTC m=+1062.585751250" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.338457 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a072c71-36df-4f5f-a7e9-90989b0d64ae" path="/var/lib/kubelet/pods/0a072c71-36df-4f5f-a7e9-90989b0d64ae/volumes" Oct 03 08:57:56 crc kubenswrapper[4790]: I1003 08:57:56.575436 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-8ljvm-config-cvgm4"] Oct 03 08:57:57 crc kubenswrapper[4790]: I1003 08:57:57.228633 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8ljvm-config-cvgm4" event={"ID":"2d7ff74f-da99-474d-9044-630e6103544c","Type":"ContainerStarted","Data":"c2fe2465c74cb35c41f727ed18c5dc12b448787c1a2a117121bc2df28cf5752c"} Oct 03 08:57:57 crc kubenswrapper[4790]: I1003 08:57:57.230153 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8ljvm-config-cvgm4" event={"ID":"2d7ff74f-da99-474d-9044-630e6103544c","Type":"ContainerStarted","Data":"1e0cf5aed58c308dee92ea9912ae551e80b0b85f86808ac9f3542eb40853945e"} Oct 03 08:57:57 crc kubenswrapper[4790]: I1003 08:57:57.235028 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"a0245c901d43e964182a90ebb86ddd69f46175bfe516454e5b06033432d6e359"} Oct 03 08:57:57 crc kubenswrapper[4790]: I1003 08:57:57.235091 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"9a14dd7da4cdb1ec4521844d26baefe73dc4654abfce00a45b73da0c3c579763"} Oct 03 08:57:57 crc kubenswrapper[4790]: I1003 08:57:57.235106 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"8db2ba32ae7d91f048da64b0f75e5d5f79a9a3be354bcbb86558f7425b706256"} Oct 03 08:57:57 crc kubenswrapper[4790]: I1003 08:57:57.235114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"c8afd2b3051fde3be7d1ea7cb32f1e50985a6e54dab22e30b54a65173c4c4cdc"} Oct 03 08:57:57 crc kubenswrapper[4790]: I1003 08:57:57.238009 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"969ea996-248e-41c5-a868-bd2321bfdf49","Type":"ContainerStarted","Data":"6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad"} Oct 03 08:57:58 crc kubenswrapper[4790]: I1003 08:57:58.250868 4790 generic.go:334] "Generic (PLEG): container finished" podID="2d7ff74f-da99-474d-9044-630e6103544c" containerID="c2fe2465c74cb35c41f727ed18c5dc12b448787c1a2a117121bc2df28cf5752c" exitCode=0 Oct 03 08:57:58 crc kubenswrapper[4790]: I1003 08:57:58.250973 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8ljvm-config-cvgm4" event={"ID":"2d7ff74f-da99-474d-9044-630e6103544c","Type":"ContainerDied","Data":"c2fe2465c74cb35c41f727ed18c5dc12b448787c1a2a117121bc2df28cf5752c"} Oct 03 08:57:58 crc kubenswrapper[4790]: I1003 08:57:58.267496 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"2a9fe7748ff9cee591c830502ca8be8e44f891884ec295a0e53217567c64c929"} Oct 03 08:57:58 crc kubenswrapper[4790]: I1003 08:57:58.270927 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"969ea996-248e-41c5-a868-bd2321bfdf49","Type":"ContainerStarted","Data":"b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538"} Oct 03 08:57:58 crc kubenswrapper[4790]: 
I1003 08:57:58.313032 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.313011942 podStartE2EDuration="21.313011942s" podCreationTimestamp="2025-10-03 08:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:57:58.306067333 +0000 UTC m=+1064.659001581" watchObservedRunningTime="2025-10-03 08:57:58.313011942 +0000 UTC m=+1064.665946170" Oct 03 08:57:58 crc kubenswrapper[4790]: I1003 08:57:58.316734 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.282704 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"c7d119fe08784bc6b9ccc2fd820918ae31889d565a2595ca084c36f3a841305e"} Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.283348 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"c2de60fd8b094a9829dfc0744f7b67ea6688a8e0b480a0e3e146ce46a01c0df0"} Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.283375 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"4e2c934d3589b1f3de04da59d8206288470e4f50be826b4b4e301c15a5c8f5e8"} Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.695589 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.733147 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.881126 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-scripts\") pod \"2d7ff74f-da99-474d-9044-630e6103544c\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.881507 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjhvc\" (UniqueName: \"kubernetes.io/projected/2d7ff74f-da99-474d-9044-630e6103544c-kube-api-access-jjhvc\") pod \"2d7ff74f-da99-474d-9044-630e6103544c\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.881637 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-additional-scripts\") pod \"2d7ff74f-da99-474d-9044-630e6103544c\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.881695 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-log-ovn\") pod \"2d7ff74f-da99-474d-9044-630e6103544c\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.881763 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run\") pod \"2d7ff74f-da99-474d-9044-630e6103544c\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.881789 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run-ovn\") pod \"2d7ff74f-da99-474d-9044-630e6103544c\" (UID: \"2d7ff74f-da99-474d-9044-630e6103544c\") " Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.881971 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2d7ff74f-da99-474d-9044-630e6103544c" (UID: "2d7ff74f-da99-474d-9044-630e6103544c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.882004 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run" (OuterVolumeSpecName: "var-run") pod "2d7ff74f-da99-474d-9044-630e6103544c" (UID: "2d7ff74f-da99-474d-9044-630e6103544c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.882061 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2d7ff74f-da99-474d-9044-630e6103544c" (UID: "2d7ff74f-da99-474d-9044-630e6103544c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.882222 4790 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.882237 4790 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.882246 4790 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7ff74f-da99-474d-9044-630e6103544c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.882496 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "2d7ff74f-da99-474d-9044-630e6103544c" (UID: "2d7ff74f-da99-474d-9044-630e6103544c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.882800 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-scripts" (OuterVolumeSpecName: "scripts") pod "2d7ff74f-da99-474d-9044-630e6103544c" (UID: "2d7ff74f-da99-474d-9044-630e6103544c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.903082 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7ff74f-da99-474d-9044-630e6103544c-kube-api-access-jjhvc" (OuterVolumeSpecName: "kube-api-access-jjhvc") pod "2d7ff74f-da99-474d-9044-630e6103544c" (UID: "2d7ff74f-da99-474d-9044-630e6103544c"). InnerVolumeSpecName "kube-api-access-jjhvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.983413 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjhvc\" (UniqueName: \"kubernetes.io/projected/2d7ff74f-da99-474d-9044-630e6103544c-kube-api-access-jjhvc\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.983450 4790 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:57:59 crc kubenswrapper[4790]: I1003 08:57:59.983460 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d7ff74f-da99-474d-9044-630e6103544c-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.047833 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.084513 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-c8zcf"] Oct 03 08:58:00 crc kubenswrapper[4790]: E1003 08:58:00.084965 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d7ff74f-da99-474d-9044-630e6103544c" containerName="ovn-config" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.084990 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2d7ff74f-da99-474d-9044-630e6103544c" containerName="ovn-config" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.085202 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d7ff74f-da99-474d-9044-630e6103544c" containerName="ovn-config" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.085961 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-c8zcf" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.113223 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-c8zcf"] Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.187954 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2slf\" (UniqueName: \"kubernetes.io/projected/a103c12c-c153-4074-ab87-d9ddb6536748-kube-api-access-v2slf\") pod \"barbican-db-create-c8zcf\" (UID: \"a103c12c-c153-4074-ab87-d9ddb6536748\") " pod="openstack/barbican-db-create-c8zcf" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.292845 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2slf\" (UniqueName: \"kubernetes.io/projected/a103c12c-c153-4074-ab87-d9ddb6536748-kube-api-access-v2slf\") pod \"barbican-db-create-c8zcf\" (UID: \"a103c12c-c153-4074-ab87-d9ddb6536748\") " pod="openstack/barbican-db-create-c8zcf" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.333308 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="04926fe2-c5e2-4663-8945-448a17ec2a8b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.344657 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-8ljvm-config-cvgm4" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.354344 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2slf\" (UniqueName: \"kubernetes.io/projected/a103c12c-c153-4074-ab87-d9ddb6536748-kube-api-access-v2slf\") pod \"barbican-db-create-c8zcf\" (UID: \"a103c12c-c153-4074-ab87-d9ddb6536748\") " pod="openstack/barbican-db-create-c8zcf" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.354362 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-8ljvm-config-cvgm4" event={"ID":"2d7ff74f-da99-474d-9044-630e6103544c","Type":"ContainerDied","Data":"1e0cf5aed58c308dee92ea9912ae551e80b0b85f86808ac9f3542eb40853945e"} Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.354605 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e0cf5aed58c308dee92ea9912ae551e80b0b85f86808ac9f3542eb40853945e" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.393069 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"84d385a54ba4269d4421c250d9d2d3298d77ac1088ffe82c8c424e6652042548"} Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.393138 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"eb67d868de657a91011dd648d21bf94a663c1da5fbf5a275fa6237a1165680d7"} Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.393152 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"552e66539fd1c2bfbed5c2fa451692eb4bbd0c4f264e9234ca861d4b4c13d50f"} Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.393164 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"c8a52586f2916bcf43c880f0f0add1a74b7045be27da125a4c0bf42402daffad"} Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.401957 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-c8zcf" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.433507 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sn67l"] Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.441509 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sn67l" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.488865 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sn67l"] Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.510067 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-8ljvm-config-cvgm4"] Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.521902 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-8ljvm-config-cvgm4"] Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.599907 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9k9w\" (UniqueName: \"kubernetes.io/projected/eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e-kube-api-access-g9k9w\") pod \"cinder-db-create-sn67l\" (UID: \"eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e\") " pod="openstack/cinder-db-create-sn67l" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.660938 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-djrdw"] Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.663592 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-djrdw" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.702516 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-djrdw"] Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.705237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9k9w\" (UniqueName: \"kubernetes.io/projected/eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e-kube-api-access-g9k9w\") pod \"cinder-db-create-sn67l\" (UID: \"eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e\") " pod="openstack/cinder-db-create-sn67l" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.737899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9k9w\" (UniqueName: \"kubernetes.io/projected/eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e-kube-api-access-g9k9w\") pod \"cinder-db-create-sn67l\" (UID: \"eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e\") " pod="openstack/cinder-db-create-sn67l" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.812644 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssznt\" (UniqueName: \"kubernetes.io/projected/9b6fdd58-025f-4e30-a384-d4f2104e5cea-kube-api-access-ssznt\") pod \"neutron-db-create-djrdw\" (UID: \"9b6fdd58-025f-4e30-a384-d4f2104e5cea\") " pod="openstack/neutron-db-create-djrdw" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.841592 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sn67l" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.843314 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-f9rhk"] Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.849707 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.856255 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fnq8f" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.856515 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.856662 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.856770 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.875088 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f9rhk"] Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.917115 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssznt\" (UniqueName: \"kubernetes.io/projected/9b6fdd58-025f-4e30-a384-d4f2104e5cea-kube-api-access-ssznt\") pod \"neutron-db-create-djrdw\" (UID: \"9b6fdd58-025f-4e30-a384-d4f2104e5cea\") " pod="openstack/neutron-db-create-djrdw" Oct 03 08:58:00 crc kubenswrapper[4790]: I1003 08:58:00.941807 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssznt\" (UniqueName: \"kubernetes.io/projected/9b6fdd58-025f-4e30-a384-d4f2104e5cea-kube-api-access-ssznt\") pod \"neutron-db-create-djrdw\" (UID: \"9b6fdd58-025f-4e30-a384-d4f2104e5cea\") " pod="openstack/neutron-db-create-djrdw" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.019189 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l82z\" (UniqueName: \"kubernetes.io/projected/cde60e0e-020c-42ca-9677-05fc2d3d7a72-kube-api-access-4l82z\") pod \"keystone-db-sync-f9rhk\" (UID: 
\"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.019594 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-combined-ca-bundle\") pod \"keystone-db-sync-f9rhk\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.019657 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-config-data\") pod \"keystone-db-sync-f9rhk\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.026144 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-djrdw" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.121428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l82z\" (UniqueName: \"kubernetes.io/projected/cde60e0e-020c-42ca-9677-05fc2d3d7a72-kube-api-access-4l82z\") pod \"keystone-db-sync-f9rhk\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.121533 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-combined-ca-bundle\") pod \"keystone-db-sync-f9rhk\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.121570 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-config-data\") pod \"keystone-db-sync-f9rhk\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.126467 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-config-data\") pod \"keystone-db-sync-f9rhk\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.130185 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-combined-ca-bundle\") pod \"keystone-db-sync-f9rhk\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.167138 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l82z\" (UniqueName: \"kubernetes.io/projected/cde60e0e-020c-42ca-9677-05fc2d3d7a72-kube-api-access-4l82z\") pod \"keystone-db-sync-f9rhk\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.191940 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.208992 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-c8zcf"] Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.419374 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-c8zcf" event={"ID":"a103c12c-c153-4074-ab87-d9ddb6536748","Type":"ContainerStarted","Data":"4fbac9829c4837a764bf1b3b8512ed96c30bb75baf06151abc9da2aadb7b6483"} Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.444602 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sn67l"] Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.458853 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"1d725f65ee29b136ae4bdb03b20a0a0804b55e94865f038b695bfb3ecac2b4b0"} Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.458907 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"d485d9c1dd0642c0ead6698512b84dd287b1a26210827b547105bba50dbcca76"} Oct 03 08:58:01 crc kubenswrapper[4790]: I1003 08:58:01.617251 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-djrdw"] Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.023919 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f9rhk"] Oct 03 08:58:02 crc kubenswrapper[4790]: W1003 08:58:02.049497 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcde60e0e_020c_42ca_9677_05fc2d3d7a72.slice/crio-62baf587c17d293fcd380e1df14e4b6e1636b489b510ec0441332d79325fddb5 WatchSource:0}: Error finding container 
62baf587c17d293fcd380e1df14e4b6e1636b489b510ec0441332d79325fddb5: Status 404 returned error can't find the container with id 62baf587c17d293fcd380e1df14e4b6e1636b489b510ec0441332d79325fddb5 Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.343664 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d7ff74f-da99-474d-9044-630e6103544c" path="/var/lib/kubelet/pods/2d7ff74f-da99-474d-9044-630e6103544c/volumes" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.469343 4790 generic.go:334] "Generic (PLEG): container finished" podID="a103c12c-c153-4074-ab87-d9ddb6536748" containerID="83d384e00619ab506d3e1e6efec39d645745cc27660e6db983ea55d26cf7f3d9" exitCode=0 Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.469409 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-c8zcf" event={"ID":"a103c12c-c153-4074-ab87-d9ddb6536748","Type":"ContainerDied","Data":"83d384e00619ab506d3e1e6efec39d645745cc27660e6db983ea55d26cf7f3d9"} Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.482254 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6c105dc-48df-4d07-8a82-ca93e78ad98c","Type":"ContainerStarted","Data":"f687711ce482bfaa0264eb429a9ffe01fbf50409b3f662f126418b40d3dd82c4"} Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.487459 4790 generic.go:334] "Generic (PLEG): container finished" podID="eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e" containerID="d09f3aca7d9ce961a7309d936bb2ebdbbe99d1c086f5a27eeba56ef4836247cb" exitCode=0 Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.487617 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sn67l" event={"ID":"eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e","Type":"ContainerDied","Data":"d09f3aca7d9ce961a7309d936bb2ebdbbe99d1c086f5a27eeba56ef4836247cb"} Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.487643 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-sn67l" event={"ID":"eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e","Type":"ContainerStarted","Data":"bdc3c63833f4a13d134682d5ba5582fdb3ce249920bdbebf55b0c514e08d4d39"} Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.494977 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f9rhk" event={"ID":"cde60e0e-020c-42ca-9677-05fc2d3d7a72","Type":"ContainerStarted","Data":"62baf587c17d293fcd380e1df14e4b6e1636b489b510ec0441332d79325fddb5"} Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.496531 4790 generic.go:334] "Generic (PLEG): container finished" podID="9b6fdd58-025f-4e30-a384-d4f2104e5cea" containerID="51f3869a3a9c24efd2663439eb15f61a38025ec13909a422fd2767c106cdbe4a" exitCode=0 Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.496671 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-djrdw" event={"ID":"9b6fdd58-025f-4e30-a384-d4f2104e5cea","Type":"ContainerDied","Data":"51f3869a3a9c24efd2663439eb15f61a38025ec13909a422fd2767c106cdbe4a"} Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.496740 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-djrdw" event={"ID":"9b6fdd58-025f-4e30-a384-d4f2104e5cea","Type":"ContainerStarted","Data":"1e4a7aae56cebd253f3a95f14f26c3ada91d27aea0e5659e9dab6552f6f1bf7f"} Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.605336 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=42.606512068 podStartE2EDuration="46.605319346s" podCreationTimestamp="2025-10-03 08:57:16 +0000 UTC" firstStartedPulling="2025-10-03 08:57:55.222553904 +0000 UTC m=+1061.575488142" lastFinishedPulling="2025-10-03 08:57:59.221361182 +0000 UTC m=+1065.574295420" observedRunningTime="2025-10-03 08:58:02.588306249 +0000 UTC m=+1068.941240497" watchObservedRunningTime="2025-10-03 08:58:02.605319346 +0000 UTC m=+1068.958253584" Oct 03 
08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.819230 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-989b5cc47-594d9"] Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.821103 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.823929 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.828703 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-989b5cc47-594d9"] Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.880801 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9ksm\" (UniqueName: \"kubernetes.io/projected/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-kube-api-access-c9ksm\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.880853 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-sb\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.881095 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-config\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.881164 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-swift-storage-0\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.881410 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-nb\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.881487 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-svc\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.983273 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-config\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.983324 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-swift-storage-0\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.983370 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-nb\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.983387 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-svc\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.983454 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9ksm\" (UniqueName: \"kubernetes.io/projected/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-kube-api-access-c9ksm\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.984621 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-svc\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.984622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-config\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.984729 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-sb\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.984772 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-nb\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.985568 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-sb\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:02 crc kubenswrapper[4790]: I1003 08:58:02.985700 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-swift-storage-0\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:03 crc kubenswrapper[4790]: I1003 08:58:03.013531 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9ksm\" (UniqueName: \"kubernetes.io/projected/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-kube-api-access-c9ksm\") pod \"dnsmasq-dns-989b5cc47-594d9\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:03 crc kubenswrapper[4790]: I1003 08:58:03.145122 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:05 crc kubenswrapper[4790]: I1003 08:58:05.622399 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-989b5cc47-594d9"] Oct 03 08:58:06 crc kubenswrapper[4790]: I1003 08:58:06.559231 4790 generic.go:334] "Generic (PLEG): container finished" podID="2859de83-046e-475b-bb37-fb8f4a3de695" containerID="29a7188a62adcae7fa702c6f2b865eeb573ec43ca9cb59ab1a3ab007c0a35f02" exitCode=0 Oct 03 08:58:06 crc kubenswrapper[4790]: I1003 08:58:06.559279 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m92cz" event={"ID":"2859de83-046e-475b-bb37-fb8f4a3de695","Type":"ContainerDied","Data":"29a7188a62adcae7fa702c6f2b865eeb573ec43ca9cb59ab1a3ab007c0a35f02"} Oct 03 08:58:07 crc kubenswrapper[4790]: W1003 08:58:07.420522 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ab392a_37b8_46fa_b26c_5a1fb0a218e8.slice/crio-4c0690eb69d02f552c314e0b0c2dfe2a621f0731ba4b3c4dfaa4b9a243c8a317 WatchSource:0}: Error finding container 4c0690eb69d02f552c314e0b0c2dfe2a621f0731ba4b3c4dfaa4b9a243c8a317: Status 404 returned error can't find the container with id 4c0690eb69d02f552c314e0b0c2dfe2a621f0731ba4b3c4dfaa4b9a243c8a317 Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.569301 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-989b5cc47-594d9" event={"ID":"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8","Type":"ContainerStarted","Data":"4c0690eb69d02f552c314e0b0c2dfe2a621f0731ba4b3c4dfaa4b9a243c8a317"} Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.570698 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-djrdw" event={"ID":"9b6fdd58-025f-4e30-a384-d4f2104e5cea","Type":"ContainerDied","Data":"1e4a7aae56cebd253f3a95f14f26c3ada91d27aea0e5659e9dab6552f6f1bf7f"} Oct 03 08:58:07 crc kubenswrapper[4790]: 
I1003 08:58:07.570739 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e4a7aae56cebd253f3a95f14f26c3ada91d27aea0e5659e9dab6552f6f1bf7f" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.572215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-c8zcf" event={"ID":"a103c12c-c153-4074-ab87-d9ddb6536748","Type":"ContainerDied","Data":"4fbac9829c4837a764bf1b3b8512ed96c30bb75baf06151abc9da2aadb7b6483"} Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.572251 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fbac9829c4837a764bf1b3b8512ed96c30bb75baf06151abc9da2aadb7b6483" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.574366 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sn67l" event={"ID":"eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e","Type":"ContainerDied","Data":"bdc3c63833f4a13d134682d5ba5582fdb3ce249920bdbebf55b0c514e08d4d39"} Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.574386 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdc3c63833f4a13d134682d5ba5582fdb3ce249920bdbebf55b0c514e08d4d39" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.652626 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sn67l" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.668661 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-c8zcf" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.682908 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-djrdw" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.765046 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9k9w\" (UniqueName: \"kubernetes.io/projected/eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e-kube-api-access-g9k9w\") pod \"eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e\" (UID: \"eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e\") " Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.774985 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e-kube-api-access-g9k9w" (OuterVolumeSpecName: "kube-api-access-g9k9w") pod "eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e" (UID: "eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e"). InnerVolumeSpecName "kube-api-access-g9k9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.866844 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssznt\" (UniqueName: \"kubernetes.io/projected/9b6fdd58-025f-4e30-a384-d4f2104e5cea-kube-api-access-ssznt\") pod \"9b6fdd58-025f-4e30-a384-d4f2104e5cea\" (UID: \"9b6fdd58-025f-4e30-a384-d4f2104e5cea\") " Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.866972 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2slf\" (UniqueName: \"kubernetes.io/projected/a103c12c-c153-4074-ab87-d9ddb6536748-kube-api-access-v2slf\") pod \"a103c12c-c153-4074-ab87-d9ddb6536748\" (UID: \"a103c12c-c153-4074-ab87-d9ddb6536748\") " Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.867316 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9k9w\" (UniqueName: \"kubernetes.io/projected/eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e-kube-api-access-g9k9w\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.871151 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b6fdd58-025f-4e30-a384-d4f2104e5cea-kube-api-access-ssznt" (OuterVolumeSpecName: "kube-api-access-ssznt") pod "9b6fdd58-025f-4e30-a384-d4f2104e5cea" (UID: "9b6fdd58-025f-4e30-a384-d4f2104e5cea"). InnerVolumeSpecName "kube-api-access-ssznt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.871597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a103c12c-c153-4074-ab87-d9ddb6536748-kube-api-access-v2slf" (OuterVolumeSpecName: "kube-api-access-v2slf") pod "a103c12c-c153-4074-ab87-d9ddb6536748" (UID: "a103c12c-c153-4074-ab87-d9ddb6536748"). InnerVolumeSpecName "kube-api-access-v2slf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.969058 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2slf\" (UniqueName: \"kubernetes.io/projected/a103c12c-c153-4074-ab87-d9ddb6536748-kube-api-access-v2slf\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:07 crc kubenswrapper[4790]: I1003 08:58:07.969136 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssznt\" (UniqueName: \"kubernetes.io/projected/9b6fdd58-025f-4e30-a384-d4f2104e5cea-kube-api-access-ssznt\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.000881 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m92cz" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.176181 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxrhn\" (UniqueName: \"kubernetes.io/projected/2859de83-046e-475b-bb37-fb8f4a3de695-kube-api-access-lxrhn\") pod \"2859de83-046e-475b-bb37-fb8f4a3de695\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.176284 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-db-sync-config-data\") pod \"2859de83-046e-475b-bb37-fb8f4a3de695\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.176318 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-config-data\") pod \"2859de83-046e-475b-bb37-fb8f4a3de695\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.176377 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-combined-ca-bundle\") pod \"2859de83-046e-475b-bb37-fb8f4a3de695\" (UID: \"2859de83-046e-475b-bb37-fb8f4a3de695\") " Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.197761 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2859de83-046e-475b-bb37-fb8f4a3de695" (UID: "2859de83-046e-475b-bb37-fb8f4a3de695"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.210701 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2859de83-046e-475b-bb37-fb8f4a3de695-kube-api-access-lxrhn" (OuterVolumeSpecName: "kube-api-access-lxrhn") pod "2859de83-046e-475b-bb37-fb8f4a3de695" (UID: "2859de83-046e-475b-bb37-fb8f4a3de695"). InnerVolumeSpecName "kube-api-access-lxrhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.226344 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2859de83-046e-475b-bb37-fb8f4a3de695" (UID: "2859de83-046e-475b-bb37-fb8f4a3de695"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.253497 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-config-data" (OuterVolumeSpecName: "config-data") pod "2859de83-046e-475b-bb37-fb8f4a3de695" (UID: "2859de83-046e-475b-bb37-fb8f4a3de695"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.278048 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxrhn\" (UniqueName: \"kubernetes.io/projected/2859de83-046e-475b-bb37-fb8f4a3de695-kube-api-access-lxrhn\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.278089 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.278099 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.278109 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2859de83-046e-475b-bb37-fb8f4a3de695-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.316957 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.323974 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.597781 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f9rhk" event={"ID":"cde60e0e-020c-42ca-9677-05fc2d3d7a72","Type":"ContainerStarted","Data":"4ba6f46b86ce0149c3824563a53028342ad3c1387370b258b8c7ff65b4b88c86"} Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.604543 4790 generic.go:334] "Generic (PLEG): container finished" podID="d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" 
containerID="e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b" exitCode=0 Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.604727 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-989b5cc47-594d9" event={"ID":"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8","Type":"ContainerDied","Data":"e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b"} Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.612524 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sn67l" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.612629 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m92cz" event={"ID":"2859de83-046e-475b-bb37-fb8f4a3de695","Type":"ContainerDied","Data":"e5c7a0b846a2faad1bc369aaebdd77745c853b7ea509357ef22f83588a2a6923"} Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.612677 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5c7a0b846a2faad1bc369aaebdd77745c853b7ea509357ef22f83588a2a6923" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.612796 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-djrdw" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.612821 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m92cz" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.613627 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-c8zcf" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.625661 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 03 08:58:08 crc kubenswrapper[4790]: I1003 08:58:08.632689 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-f9rhk" podStartSLOduration=3.20265033 podStartE2EDuration="8.6326602s" podCreationTimestamp="2025-10-03 08:58:00 +0000 UTC" firstStartedPulling="2025-10-03 08:58:02.062383762 +0000 UTC m=+1068.415318000" lastFinishedPulling="2025-10-03 08:58:07.492393632 +0000 UTC m=+1073.845327870" observedRunningTime="2025-10-03 08:58:08.629076312 +0000 UTC m=+1074.982010550" watchObservedRunningTime="2025-10-03 08:58:08.6326602 +0000 UTC m=+1074.985594448" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.050806 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-989b5cc47-594d9"] Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.115170 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56d75b6d47-8rct7"] Oct 03 08:58:09 crc kubenswrapper[4790]: E1003 08:58:09.128402 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2859de83-046e-475b-bb37-fb8f4a3de695" containerName="glance-db-sync" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.128438 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2859de83-046e-475b-bb37-fb8f4a3de695" containerName="glance-db-sync" Oct 03 08:58:09 crc kubenswrapper[4790]: E1003 08:58:09.128460 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e" containerName="mariadb-database-create" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.128474 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e" containerName="mariadb-database-create" Oct 
03 08:58:09 crc kubenswrapper[4790]: E1003 08:58:09.128488 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b6fdd58-025f-4e30-a384-d4f2104e5cea" containerName="mariadb-database-create" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.128499 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b6fdd58-025f-4e30-a384-d4f2104e5cea" containerName="mariadb-database-create" Oct 03 08:58:09 crc kubenswrapper[4790]: E1003 08:58:09.128529 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a103c12c-c153-4074-ab87-d9ddb6536748" containerName="mariadb-database-create" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.128539 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a103c12c-c153-4074-ab87-d9ddb6536748" containerName="mariadb-database-create" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.139854 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2859de83-046e-475b-bb37-fb8f4a3de695" containerName="glance-db-sync" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.139904 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a103c12c-c153-4074-ab87-d9ddb6536748" containerName="mariadb-database-create" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.139944 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b6fdd58-025f-4e30-a384-d4f2104e5cea" containerName="mariadb-database-create" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.139963 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e" containerName="mariadb-database-create" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.141689 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.164137 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56d75b6d47-8rct7"] Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.339617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-nb\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.339692 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-swift-storage-0\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.339781 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-config\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.339852 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-svc\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.339905 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-sb\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.339951 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np5v2\" (UniqueName: \"kubernetes.io/projected/67fecd5f-4a82-413f-88ba-a07aa56c71a8-kube-api-access-np5v2\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.443569 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-sb\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.443717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np5v2\" (UniqueName: \"kubernetes.io/projected/67fecd5f-4a82-413f-88ba-a07aa56c71a8-kube-api-access-np5v2\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.443752 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-nb\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.443788 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-swift-storage-0\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.443870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-config\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.443968 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-svc\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.445674 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-sb\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.447900 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-nb\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.448339 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-swift-storage-0\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.449085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-svc\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.449208 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-config\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.479327 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np5v2\" (UniqueName: \"kubernetes.io/projected/67fecd5f-4a82-413f-88ba-a07aa56c71a8-kube-api-access-np5v2\") pod \"dnsmasq-dns-56d75b6d47-8rct7\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.625448 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-989b5cc47-594d9" event={"ID":"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8","Type":"ContainerStarted","Data":"e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407"} Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.659686 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-989b5cc47-594d9" podStartSLOduration=7.659657737 podStartE2EDuration="7.659657737s" podCreationTimestamp="2025-10-03 08:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:09.654004602 +0000 UTC m=+1076.006938840" watchObservedRunningTime="2025-10-03 08:58:09.659657737 +0000 UTC m=+1076.012591975" Oct 03 08:58:09 crc kubenswrapper[4790]: I1003 08:58:09.777056 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.186016 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.186280 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.237287 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56d75b6d47-8rct7"] Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.318160 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.552305 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-8j2zm"] Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.554701 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.560477 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-5f8dh" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.560860 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.563203 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-8j2zm"] Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.634052 4790 generic.go:334] "Generic (PLEG): container finished" podID="67fecd5f-4a82-413f-88ba-a07aa56c71a8" containerID="c3fca4c476f7b449af2718dd47299c6d7cdaf0c30e27ccf14f5fcf0430a8173c" exitCode=0 Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.634161 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" event={"ID":"67fecd5f-4a82-413f-88ba-a07aa56c71a8","Type":"ContainerDied","Data":"c3fca4c476f7b449af2718dd47299c6d7cdaf0c30e27ccf14f5fcf0430a8173c"} Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.634235 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-989b5cc47-594d9" podUID="d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" containerName="dnsmasq-dns" containerID="cri-o://e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407" gracePeriod=10 Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.634250 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" event={"ID":"67fecd5f-4a82-413f-88ba-a07aa56c71a8","Type":"ContainerStarted","Data":"8cf55ead6f46f31919be112f8da85e30f23ed27a30d098eb4b49e904257b164e"} Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.634415 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:10 crc 
kubenswrapper[4790]: I1003 08:58:10.666246 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-config-data\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.666315 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-db-sync-config-data\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.666367 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-combined-ca-bundle\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.666458 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj29t\" (UniqueName: \"kubernetes.io/projected/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-kube-api-access-cj29t\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.769042 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-config-data\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 
08:58:10.769106 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-db-sync-config-data\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.769131 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-combined-ca-bundle\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.769152 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj29t\" (UniqueName: \"kubernetes.io/projected/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-kube-api-access-cj29t\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.776664 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-db-sync-config-data\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.779906 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-config-data\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.781298 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-combined-ca-bundle\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.800970 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj29t\" (UniqueName: \"kubernetes.io/projected/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-kube-api-access-cj29t\") pod \"watcher-db-sync-8j2zm\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:10 crc kubenswrapper[4790]: I1003 08:58:10.903352 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.146019 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.282103 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-nb\") pod \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.282379 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-svc\") pod \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.282424 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9ksm\" (UniqueName: \"kubernetes.io/projected/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-kube-api-access-c9ksm\") pod \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\" (UID: 
\"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.282476 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-config\") pod \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.282551 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-sb\") pod \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.282678 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-swift-storage-0\") pod \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\" (UID: \"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8\") " Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.288525 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-kube-api-access-c9ksm" (OuterVolumeSpecName: "kube-api-access-c9ksm") pod "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" (UID: "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8"). InnerVolumeSpecName "kube-api-access-c9ksm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.339105 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" (UID: "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.352127 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" (UID: "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.353894 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" (UID: "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.358915 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" (UID: "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.368507 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-config" (OuterVolumeSpecName: "config") pod "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" (UID: "d4ab392a-37b8-46fa-b26c-5a1fb0a218e8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.385985 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.386340 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9ksm\" (UniqueName: \"kubernetes.io/projected/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-kube-api-access-c9ksm\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.386424 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.386553 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.386655 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.386725 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.492474 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-8j2zm"] Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.652428 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8j2zm" 
event={"ID":"8d78d2bd-0369-4e94-9f44-4e57634ab8b6","Type":"ContainerStarted","Data":"a7e378512a498b341651339187ed17bd8ccc34053fdfdfc76665dc157cbdd7b3"} Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.661638 4790 generic.go:334] "Generic (PLEG): container finished" podID="d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" containerID="e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407" exitCode=0 Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.661772 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-989b5cc47-594d9" event={"ID":"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8","Type":"ContainerDied","Data":"e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407"} Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.661817 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-989b5cc47-594d9" event={"ID":"d4ab392a-37b8-46fa-b26c-5a1fb0a218e8","Type":"ContainerDied","Data":"4c0690eb69d02f552c314e0b0c2dfe2a621f0731ba4b3c4dfaa4b9a243c8a317"} Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.661845 4790 scope.go:117] "RemoveContainer" containerID="e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.662101 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-989b5cc47-594d9" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.671006 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" event={"ID":"67fecd5f-4a82-413f-88ba-a07aa56c71a8","Type":"ContainerStarted","Data":"eb8c3dc71c8ca8b48950da0bcc2bd797b69afd28744ac4b7dd0cd9dd51529407"} Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.671205 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.696176 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" podStartSLOduration=2.6961555710000003 podStartE2EDuration="2.696155571s" podCreationTimestamp="2025-10-03 08:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:11.692037328 +0000 UTC m=+1078.044971576" watchObservedRunningTime="2025-10-03 08:58:11.696155571 +0000 UTC m=+1078.049089809" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.712164 4790 scope.go:117] "RemoveContainer" containerID="e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.719518 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-989b5cc47-594d9"] Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.736000 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-989b5cc47-594d9"] Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.748066 4790 scope.go:117] "RemoveContainer" containerID="e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407" Oct 03 08:58:11 crc kubenswrapper[4790]: E1003 08:58:11.751544 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407\": container with ID starting with e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407 not found: ID does not exist" containerID="e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.751710 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407"} err="failed to get container status \"e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407\": rpc error: code = NotFound desc = could not find container \"e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407\": container with ID starting with e1776677ceae5c48e84684825585c13b684d5be60a15958d46fcd15036937407 not found: ID does not exist" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.751743 4790 scope.go:117] "RemoveContainer" containerID="e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b" Oct 03 08:58:11 crc kubenswrapper[4790]: E1003 08:58:11.752175 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b\": container with ID starting with e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b not found: ID does not exist" containerID="e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b" Oct 03 08:58:11 crc kubenswrapper[4790]: I1003 08:58:11.752205 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b"} err="failed to get container status \"e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b\": rpc error: code = NotFound desc = could not find container \"e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b\": 
container with ID starting with e7042232f0620a6f157d0c5dc7260a7cad3b98d14b3bf2b7101b435aea20714b not found: ID does not exist" Oct 03 08:58:12 crc kubenswrapper[4790]: I1003 08:58:12.342350 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" path="/var/lib/kubelet/pods/d4ab392a-37b8-46fa-b26c-5a1fb0a218e8/volumes" Oct 03 08:58:12 crc kubenswrapper[4790]: I1003 08:58:12.694929 4790 generic.go:334] "Generic (PLEG): container finished" podID="cde60e0e-020c-42ca-9677-05fc2d3d7a72" containerID="4ba6f46b86ce0149c3824563a53028342ad3c1387370b258b8c7ff65b4b88c86" exitCode=0 Oct 03 08:58:12 crc kubenswrapper[4790]: I1003 08:58:12.695009 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f9rhk" event={"ID":"cde60e0e-020c-42ca-9677-05fc2d3d7a72","Type":"ContainerDied","Data":"4ba6f46b86ce0149c3824563a53028342ad3c1387370b258b8c7ff65b4b88c86"} Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.437411 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.516430 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-combined-ca-bundle\") pod \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.516910 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-config-data\") pod \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.517001 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l82z\" (UniqueName: \"kubernetes.io/projected/cde60e0e-020c-42ca-9677-05fc2d3d7a72-kube-api-access-4l82z\") pod \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\" (UID: \"cde60e0e-020c-42ca-9677-05fc2d3d7a72\") " Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.524840 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde60e0e-020c-42ca-9677-05fc2d3d7a72-kube-api-access-4l82z" (OuterVolumeSpecName: "kube-api-access-4l82z") pod "cde60e0e-020c-42ca-9677-05fc2d3d7a72" (UID: "cde60e0e-020c-42ca-9677-05fc2d3d7a72"). InnerVolumeSpecName "kube-api-access-4l82z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.542738 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cde60e0e-020c-42ca-9677-05fc2d3d7a72" (UID: "cde60e0e-020c-42ca-9677-05fc2d3d7a72"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.560193 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-config-data" (OuterVolumeSpecName: "config-data") pod "cde60e0e-020c-42ca-9677-05fc2d3d7a72" (UID: "cde60e0e-020c-42ca-9677-05fc2d3d7a72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.618709 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.618754 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cde60e0e-020c-42ca-9677-05fc2d3d7a72-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.618764 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l82z\" (UniqueName: \"kubernetes.io/projected/cde60e0e-020c-42ca-9677-05fc2d3d7a72-kube-api-access-4l82z\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.750227 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f9rhk" event={"ID":"cde60e0e-020c-42ca-9677-05fc2d3d7a72","Type":"ContainerDied","Data":"62baf587c17d293fcd380e1df14e4b6e1636b489b510ec0441332d79325fddb5"} Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.750278 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62baf587c17d293fcd380e1df14e4b6e1636b489b510ec0441332d79325fddb5" Oct 03 08:58:18 crc kubenswrapper[4790]: I1003 08:58:18.750299 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-f9rhk" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.704967 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56d75b6d47-8rct7"] Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.705857 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" podUID="67fecd5f-4a82-413f-88ba-a07aa56c71a8" containerName="dnsmasq-dns" containerID="cri-o://eb8c3dc71c8ca8b48950da0bcc2bd797b69afd28744ac4b7dd0cd9dd51529407" gracePeriod=10 Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.707524 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.741168 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f6df867cc-pshbz"] Oct 03 08:58:19 crc kubenswrapper[4790]: E1003 08:58:19.741819 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde60e0e-020c-42ca-9677-05fc2d3d7a72" containerName="keystone-db-sync" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.741854 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde60e0e-020c-42ca-9677-05fc2d3d7a72" containerName="keystone-db-sync" Oct 03 08:58:19 crc kubenswrapper[4790]: E1003 08:58:19.741889 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" containerName="dnsmasq-dns" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.741898 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" containerName="dnsmasq-dns" Oct 03 08:58:19 crc kubenswrapper[4790]: E1003 08:58:19.741933 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" containerName="init" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.741941 4790 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" containerName="init" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.742303 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde60e0e-020c-42ca-9677-05fc2d3d7a72" containerName="keystone-db-sync" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.742358 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ab392a-37b8-46fa-b26c-5a1fb0a218e8" containerName="dnsmasq-dns" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.743964 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.756831 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6df867cc-pshbz"] Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.777947 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8j2zm" event={"ID":"8d78d2bd-0369-4e94-9f44-4e57634ab8b6","Type":"ContainerStarted","Data":"b903ffd9c9737c9e3f4c122cd3010c618d42c0b548b05612976955055812f322"} Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.778983 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" podUID="67fecd5f-4a82-413f-88ba-a07aa56c71a8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: connect: connection refused" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.799671 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ztmgb"] Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.800936 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.807810 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fnq8f" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.807990 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.808134 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.808250 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.824739 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ztmgb"] Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.838954 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-svc\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839060 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-scripts\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839085 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-swift-storage-0\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: 
\"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839101 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-combined-ca-bundle\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839141 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-credential-keys\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839176 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-config-data\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839220 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-nb\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839265 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-fernet-keys\") pod \"keystone-bootstrap-ztmgb\" (UID: 
\"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839311 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82vsb\" (UniqueName: \"kubernetes.io/projected/43741e1d-af3f-4800-a5a7-cca0f6620cbc-kube-api-access-82vsb\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839392 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-config\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839409 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-sb\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.839429 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r984\" (UniqueName: \"kubernetes.io/projected/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-kube-api-access-2r984\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.851471 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-8j2zm" podStartSLOduration=2.405758938 podStartE2EDuration="9.851449264s" podCreationTimestamp="2025-10-03 08:58:10 
+0000 UTC" firstStartedPulling="2025-10-03 08:58:11.473391422 +0000 UTC m=+1077.826325660" lastFinishedPulling="2025-10-03 08:58:18.919081758 +0000 UTC m=+1085.272015986" observedRunningTime="2025-10-03 08:58:19.808364604 +0000 UTC m=+1086.161298872" watchObservedRunningTime="2025-10-03 08:58:19.851449264 +0000 UTC m=+1086.204383502" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.942984 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-config-data\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943061 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-nb\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943100 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-fernet-keys\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943127 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82vsb\" (UniqueName: \"kubernetes.io/projected/43741e1d-af3f-4800-a5a7-cca0f6620cbc-kube-api-access-82vsb\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943178 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-config\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943200 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-sb\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943228 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r984\" (UniqueName: \"kubernetes.io/projected/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-kube-api-access-2r984\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943269 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-svc\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-scripts\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943333 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-swift-storage-0\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943354 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-combined-ca-bundle\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.943389 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-credential-keys\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.945462 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-sb\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.945919 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d887b5cbf-5qjjt"] Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.949612 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.955147 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.955330 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.955453 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-6w7tq" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.956214 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-nb\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.959268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-fernet-keys\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.959449 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-scripts\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.960162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-svc\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " 
pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.960243 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.960876 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-swift-storage-0\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.960992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-config\") pod \"dnsmasq-dns-f6df867cc-pshbz\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.967324 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-combined-ca-bundle\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.987384 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-credential-keys\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.992329 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r984\" (UniqueName: \"kubernetes.io/projected/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-kube-api-access-2r984\") pod \"dnsmasq-dns-f6df867cc-pshbz\" 
(UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") " pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.994507 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82vsb\" (UniqueName: \"kubernetes.io/projected/43741e1d-af3f-4800-a5a7-cca0f6620cbc-kube-api-access-82vsb\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:19 crc kubenswrapper[4790]: I1003 08:58:19.996612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-config-data\") pod \"keystone-bootstrap-ztmgb\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.012261 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d887b5cbf-5qjjt"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.054752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a368664-2338-4579-ae15-981ee5ca3194-horizon-secret-key\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.058698 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-config-data\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.058755 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8a368664-2338-4579-ae15-981ee5ca3194-logs\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.058876 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-scripts\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.058981 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xvwx\" (UniqueName: \"kubernetes.io/projected/8a368664-2338-4579-ae15-981ee5ca3194-kube-api-access-5xvwx\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.128223 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f6df867cc-pshbz" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.144012 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.169334 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a368664-2338-4579-ae15-981ee5ca3194-logs\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.169525 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-scripts\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.170657 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xvwx\" (UniqueName: \"kubernetes.io/projected/8a368664-2338-4579-ae15-981ee5ca3194-kube-api-access-5xvwx\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.170767 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a368664-2338-4579-ae15-981ee5ca3194-horizon-secret-key\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.170803 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-config-data\") pod 
\"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.174021 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-scripts\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.177052 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-config-data\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.177353 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a368664-2338-4579-ae15-981ee5ca3194-logs\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.186346 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.193908 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a368664-2338-4579-ae15-981ee5ca3194-horizon-secret-key\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.197545 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.200154 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.212664 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.218115 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.284032 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmsgj\" (UniqueName: \"kubernetes.io/projected/8a0817c5-b72b-4433-977d-6185317b0885-kube-api-access-xmsgj\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.284300 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.284356 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-scripts\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.284377 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.284445 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-log-httpd\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.284481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-config-data\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.284597 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-run-httpd\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.300490 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xvwx\" (UniqueName: 
\"kubernetes.io/projected/8a368664-2338-4579-ae15-981ee5ca3194-kube-api-access-5xvwx\") pod \"horizon-6d887b5cbf-5qjjt\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") " pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.376806 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-vzdz9"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.377855 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.377921 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.381940 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.384254 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.387803 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.387913 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.388770 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lvfmv" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.388946 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.389137 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.389643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-run-httpd\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.389729 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmsgj\" (UniqueName: \"kubernetes.io/projected/8a0817c5-b72b-4433-977d-6185317b0885-kube-api-access-xmsgj\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.389784 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 
08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.389824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-scripts\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.389842 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.389937 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-log-httpd\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.389959 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-config-data\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.394182 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-log-httpd\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.394230 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-run-httpd\") pod \"ceilometer-0\" (UID: 
\"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.410428 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-scripts\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.411131 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.412191 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-czqcc" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.416316 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6df867cc-pshbz"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.424810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.424937 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-14c0-account-create-82p7s"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.425891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.426959 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-14c0-account-create-82p7s" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.432689 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-867694955c-fpnzb"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.440414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-config-data\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.441754 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.444420 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.448120 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmsgj\" (UniqueName: \"kubernetes.io/projected/8a0817c5-b72b-4433-977d-6185317b0885-kube-api-access-xmsgj\") pod \"ceilometer-0\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.458429 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-vzdz9"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.485216 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-14c0-account-create-82p7s"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.496481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-scripts\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 
08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.496544 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-config-data\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.496606 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx8pr\" (UniqueName: \"kubernetes.io/projected/a839597d-be75-4b9f-9d18-a2e8045a674b-kube-api-access-rx8pr\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.496634 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-scripts\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.496838 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64754289-91b5-4f84-9809-6271b6a13031-logs\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.496869 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64754289-91b5-4f84-9809-6271b6a13031-horizon-secret-key\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc 
kubenswrapper[4790]: I1003 08:58:20.496889 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlfrj\" (UniqueName: \"kubernetes.io/projected/64754289-91b5-4f84-9809-6271b6a13031-kube-api-access-jlfrj\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.496905 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-combined-ca-bundle\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.497147 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-config-data\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.497200 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a839597d-be75-4b9f-9d18-a2e8045a674b-logs\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.501622 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.531964 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.568085 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6454d9f559-66tmq"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.569674 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.596889 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6454d9f559-66tmq"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600552 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-scripts\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600612 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64754289-91b5-4f84-9809-6271b6a13031-logs\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600669 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600688 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600707 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64754289-91b5-4f84-9809-6271b6a13031-horizon-secret-key\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600723 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlfrj\" (UniqueName: \"kubernetes.io/projected/64754289-91b5-4f84-9809-6271b6a13031-kube-api-access-jlfrj\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600740 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-combined-ca-bundle\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600759 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl86j\" (UniqueName: \"kubernetes.io/projected/f11270f2-ead7-4001-94f9-54e66cfbe017-kube-api-access-wl86j\") pod \"barbican-14c0-account-create-82p7s\" (UID: \"f11270f2-ead7-4001-94f9-54e66cfbe017\") " pod="openstack/barbican-14c0-account-create-82p7s" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600813 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600873 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-config-data\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600892 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a839597d-be75-4b9f-9d18-a2e8045a674b-logs\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600929 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-scripts\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " 
pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-logs\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.600992 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.601010 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-config-data\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.601047 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzq4n\" (UniqueName: \"kubernetes.io/projected/e08f91ab-f702-49a8-be10-3114d8a10e48-kube-api-access-mzq4n\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.601120 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx8pr\" (UniqueName: \"kubernetes.io/projected/a839597d-be75-4b9f-9d18-a2e8045a674b-kube-api-access-rx8pr\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " 
pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.601972 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a839597d-be75-4b9f-9d18-a2e8045a674b-logs\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.604879 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-config-data\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.606972 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64754289-91b5-4f84-9809-6271b6a13031-logs\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.608217 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-scripts\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.614907 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-combined-ca-bundle\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.614959 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/horizon-867694955c-fpnzb"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.625750 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-config-data\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.630111 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-scripts\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.630511 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64754289-91b5-4f84-9809-6271b6a13031-horizon-secret-key\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.634029 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx8pr\" (UniqueName: \"kubernetes.io/projected/a839597d-be75-4b9f-9d18-a2e8045a674b-kube-api-access-rx8pr\") pod \"placement-db-sync-vzdz9\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.640314 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlfrj\" (UniqueName: \"kubernetes.io/projected/64754289-91b5-4f84-9809-6271b6a13031-kube-api-access-jlfrj\") pod \"horizon-867694955c-fpnzb\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") " pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.695892 4790 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/cinder-5d29-account-create-jqwpk"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.697410 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5d29-account-create-jqwpk" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.703403 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.703882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7drk6\" (UniqueName: \"kubernetes.io/projected/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-kube-api-access-7drk6\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.704003 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.704069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-logs\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.704099 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 
08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzq4n\" (UniqueName: \"kubernetes.io/projected/e08f91ab-f702-49a8-be10-3114d8a10e48-kube-api-access-mzq4n\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705064 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-sb\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705119 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-swift-storage-0\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-nb\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705217 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705235 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705265 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl86j\" (UniqueName: \"kubernetes.io/projected/f11270f2-ead7-4001-94f9-54e66cfbe017-kube-api-access-wl86j\") pod \"barbican-14c0-account-create-82p7s\" (UID: \"f11270f2-ead7-4001-94f9-54e66cfbe017\") " pod="openstack/barbican-14c0-account-create-82p7s" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705285 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705339 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-config\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" 
Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.705358 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-svc\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.704668 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-logs\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.706188 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.707945 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.727293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.729424 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.747202 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.747984 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.756505 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzq4n\" (UniqueName: \"kubernetes.io/projected/e08f91ab-f702-49a8-be10-3114d8a10e48-kube-api-access-mzq4n\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.763074 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl86j\" (UniqueName: \"kubernetes.io/projected/f11270f2-ead7-4001-94f9-54e66cfbe017-kube-api-access-wl86j\") pod \"barbican-14c0-account-create-82p7s\" (UID: \"f11270f2-ead7-4001-94f9-54e66cfbe017\") " pod="openstack/barbican-14c0-account-create-82p7s" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.764218 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.765666 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.768671 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.768741 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.790665 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5d29-account-create-jqwpk"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.800376 4790 generic.go:334] "Generic (PLEG): container finished" podID="67fecd5f-4a82-413f-88ba-a07aa56c71a8" containerID="eb8c3dc71c8ca8b48950da0bcc2bd797b69afd28744ac4b7dd0cd9dd51529407" exitCode=0 Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.801272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" event={"ID":"67fecd5f-4a82-413f-88ba-a07aa56c71a8","Type":"ContainerDied","Data":"eb8c3dc71c8ca8b48950da0bcc2bd797b69afd28744ac4b7dd0cd9dd51529407"} Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.801394 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" event={"ID":"67fecd5f-4a82-413f-88ba-a07aa56c71a8","Type":"ContainerDied","Data":"8cf55ead6f46f31919be112f8da85e30f23ed27a30d098eb4b49e904257b164e"} Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.801482 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cf55ead6f46f31919be112f8da85e30f23ed27a30d098eb4b49e904257b164e" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.802168 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.807147 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ngs9\" (UniqueName: \"kubernetes.io/projected/355f59b0-e1a4-47c9-b24b-fffda1be9952-kube-api-access-7ngs9\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.807297 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.807394 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-nb\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.807479 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-config-data\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.807587 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.807662 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-logs\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.807757 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-config\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.807835 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-svc\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.807970 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7drk6\" (UniqueName: \"kubernetes.io/projected/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-kube-api-access-7drk6\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.808094 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.808162 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.808251 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-scripts\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.808356 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-nb\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.811038 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb7ql\" (UniqueName: \"kubernetes.io/projected/feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc-kube-api-access-nb7ql\") pod \"cinder-5d29-account-create-jqwpk\" (UID: \"feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc\") " pod="openstack/cinder-5d29-account-create-jqwpk" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.811187 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-sb\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.811265 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-swift-storage-0\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.811300 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-config\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.811928 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-sb\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.812221 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-svc\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.814954 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.824904 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-swift-storage-0\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.835837 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7drk6\" (UniqueName: \"kubernetes.io/projected/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-kube-api-access-7drk6\") pod \"dnsmasq-dns-6454d9f559-66tmq\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.847305 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-2100-account-create-q4khm"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.848427 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2100-account-create-q4khm" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.853467 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2100-account-create-q4khm"] Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.853977 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.872031 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-14c0-account-create-82p7s" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.872101 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.897862 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.918023 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.918054 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ngs9\" (UniqueName: \"kubernetes.io/projected/355f59b0-e1a4-47c9-b24b-fffda1be9952-kube-api-access-7ngs9\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.918105 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-config-data\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.918141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.918161 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-logs\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 
crc kubenswrapper[4790]: I1003 08:58:20.918237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.918255 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.918275 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tg7g\" (UniqueName: \"kubernetes.io/projected/4146a9d4-bf8b-4bce-aca7-531fb9e272ea-kube-api-access-2tg7g\") pod \"neutron-2100-account-create-q4khm\" (UID: \"4146a9d4-bf8b-4bce-aca7-531fb9e272ea\") " pod="openstack/neutron-2100-account-create-q4khm" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.918300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-scripts\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.918348 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb7ql\" (UniqueName: \"kubernetes.io/projected/feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc-kube-api-access-nb7ql\") pod \"cinder-5d29-account-create-jqwpk\" (UID: \"feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc\") " pod="openstack/cinder-5d29-account-create-jqwpk" Oct 03 08:58:20 crc 
kubenswrapper[4790]: I1003 08:58:20.918965 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.921355 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.922699 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.923154 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.923230 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-logs\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.925373 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.926038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-config-data\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.937449 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-scripts\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.947385 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.977189 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ngs9\" (UniqueName: \"kubernetes.io/projected/355f59b0-e1a4-47c9-b24b-fffda1be9952-kube-api-access-7ngs9\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:20 crc kubenswrapper[4790]: I1003 08:58:20.980196 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb7ql\" (UniqueName: \"kubernetes.io/projected/feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc-kube-api-access-nb7ql\") pod \"cinder-5d29-account-create-jqwpk\" (UID: \"feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc\") " pod="openstack/cinder-5d29-account-create-jqwpk" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.022205 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.026254 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tg7g\" (UniqueName: \"kubernetes.io/projected/4146a9d4-bf8b-4bce-aca7-531fb9e272ea-kube-api-access-2tg7g\") pod \"neutron-2100-account-create-q4khm\" (UID: \"4146a9d4-bf8b-4bce-aca7-531fb9e272ea\") " pod="openstack/neutron-2100-account-create-q4khm" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.058278 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.059727 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tg7g\" (UniqueName: \"kubernetes.io/projected/4146a9d4-bf8b-4bce-aca7-531fb9e272ea-kube-api-access-2tg7g\") pod \"neutron-2100-account-create-q4khm\" (UID: \"4146a9d4-bf8b-4bce-aca7-531fb9e272ea\") " pod="openstack/neutron-2100-account-create-q4khm" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.130061 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-nb\") pod \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.130128 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-swift-storage-0\") pod \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " Oct 03 08:58:21 crc 
kubenswrapper[4790]: I1003 08:58:21.130188 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-svc\") pod \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.130357 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-config\") pod \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.130379 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-sb\") pod \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.130436 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np5v2\" (UniqueName: \"kubernetes.io/projected/67fecd5f-4a82-413f-88ba-a07aa56c71a8-kube-api-access-np5v2\") pod \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\" (UID: \"67fecd5f-4a82-413f-88ba-a07aa56c71a8\") " Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.154168 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67fecd5f-4a82-413f-88ba-a07aa56c71a8-kube-api-access-np5v2" (OuterVolumeSpecName: "kube-api-access-np5v2") pod "67fecd5f-4a82-413f-88ba-a07aa56c71a8" (UID: "67fecd5f-4a82-413f-88ba-a07aa56c71a8"). InnerVolumeSpecName "kube-api-access-np5v2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.236449 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67fecd5f-4a82-413f-88ba-a07aa56c71a8" (UID: "67fecd5f-4a82-413f-88ba-a07aa56c71a8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.238199 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np5v2\" (UniqueName: \"kubernetes.io/projected/67fecd5f-4a82-413f-88ba-a07aa56c71a8-kube-api-access-np5v2\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.238233 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.255043 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "67fecd5f-4a82-413f-88ba-a07aa56c71a8" (UID: "67fecd5f-4a82-413f-88ba-a07aa56c71a8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.274833 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-config" (OuterVolumeSpecName: "config") pod "67fecd5f-4a82-413f-88ba-a07aa56c71a8" (UID: "67fecd5f-4a82-413f-88ba-a07aa56c71a8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.274967 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67fecd5f-4a82-413f-88ba-a07aa56c71a8" (UID: "67fecd5f-4a82-413f-88ba-a07aa56c71a8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.275202 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67fecd5f-4a82-413f-88ba-a07aa56c71a8" (UID: "67fecd5f-4a82-413f-88ba-a07aa56c71a8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.280055 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5d29-account-create-jqwpk" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.322826 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.334926 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-2100-account-create-q4khm" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.350902 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.350933 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.350950 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.350958 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67fecd5f-4a82-413f-88ba-a07aa56c71a8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.469461 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ztmgb"] Oct 03 08:58:21 crc kubenswrapper[4790]: W1003 08:58:21.483643 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43741e1d_af3f_4800_a5a7_cca0f6620cbc.slice/crio-62f713261b38fce8904840d58b61fb3003e76924aec92ae9ead3903b8bf7a343 WatchSource:0}: Error finding container 62f713261b38fce8904840d58b61fb3003e76924aec92ae9ead3903b8bf7a343: Status 404 returned error can't find the container with id 62f713261b38fce8904840d58b61fb3003e76924aec92ae9ead3903b8bf7a343 Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.485714 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-f6df867cc-pshbz"] Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.710612 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d887b5cbf-5qjjt"] Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.864368 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ztmgb" event={"ID":"43741e1d-af3f-4800-a5a7-cca0f6620cbc","Type":"ContainerStarted","Data":"62f713261b38fce8904840d58b61fb3003e76924aec92ae9ead3903b8bf7a343"} Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.874069 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6df867cc-pshbz" event={"ID":"19b25afb-eabe-4e5f-9fbd-39e86c2d8690","Type":"ContainerStarted","Data":"5327c59c1585dfe10b37376951bd7dfbb20267c5493783b52351f5ec4da2b7fa"} Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.903461 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56d75b6d47-8rct7" Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.906676 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d887b5cbf-5qjjt" event={"ID":"8a368664-2338-4579-ae15-981ee5ca3194","Type":"ContainerStarted","Data":"bcba4addbf1ac988637f56e72ee0f4bba335de286226d15fb3303d9af941a7b0"} Oct 03 08:58:21 crc kubenswrapper[4790]: I1003 08:58:21.996190 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-14c0-account-create-82p7s"] Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.013335 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56d75b6d47-8rct7"] Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.020088 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56d75b6d47-8rct7"] Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.052455 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-vzdz9"] Oct 03 
08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.115360 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.233133 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-867694955c-fpnzb"] Oct 03 08:58:22 crc kubenswrapper[4790]: W1003 08:58:22.271862 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice/crio-b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf WatchSource:0}: Error finding container b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf: Status 404 returned error can't find the container with id b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.326831 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6454d9f559-66tmq"] Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.352177 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67fecd5f-4a82-413f-88ba-a07aa56c71a8" path="/var/lib/kubelet/pods/67fecd5f-4a82-413f-88ba-a07aa56c71a8/volumes" Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.389459 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.415949 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-2100-account-create-q4khm"] Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.424361 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5d29-account-create-jqwpk"] Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.740672 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.827215 4790 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:58:22 crc kubenswrapper[4790]: I1003 08:58:22.978214 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d887b5cbf-5qjjt"] Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.004481 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.005050 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ztmgb" event={"ID":"43741e1d-af3f-4800-a5a7-cca0f6620cbc","Type":"ContainerStarted","Data":"30e0fe45b4ecd503d294bff8ce70c90a46cb82b7bf568d3f1c53ed5fb95d4976"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.020032 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0817c5-b72b-4433-977d-6185317b0885","Type":"ContainerStarted","Data":"6f68cbe5675af8b8176635455fcabbd6a69f76ebd35d37d256275064dd3d9117"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.021633 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6448764df-qdvpl"] Oct 03 08:58:23 crc kubenswrapper[4790]: E1003 08:58:23.022036 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67fecd5f-4a82-413f-88ba-a07aa56c71a8" containerName="dnsmasq-dns" Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.022053 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fecd5f-4a82-413f-88ba-a07aa56c71a8" containerName="dnsmasq-dns" Oct 03 08:58:23 crc kubenswrapper[4790]: E1003 08:58:23.022063 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67fecd5f-4a82-413f-88ba-a07aa56c71a8" containerName="init" Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.022070 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fecd5f-4a82-413f-88ba-a07aa56c71a8" containerName="init" Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 
08:58:23.022276 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="67fecd5f-4a82-413f-88ba-a07aa56c71a8" containerName="dnsmasq-dns" Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.032249 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6448764df-qdvpl" Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.035483 4790 generic.go:334] "Generic (PLEG): container finished" podID="19b25afb-eabe-4e5f-9fbd-39e86c2d8690" containerID="419d2f8cd743e3a550ea077cc0c27de3068c7ab27c756dc6252eccc2d195eac3" exitCode=0 Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.035567 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6df867cc-pshbz" event={"ID":"19b25afb-eabe-4e5f-9fbd-39e86c2d8690","Type":"ContainerDied","Data":"419d2f8cd743e3a550ea077cc0c27de3068c7ab27c756dc6252eccc2d195eac3"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.040872 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" event={"ID":"1404dbad-1c09-4b76-a6a1-d563ed8d8bde","Type":"ContainerStarted","Data":"1415f324dd7837e61466dd7387c73fb163ce3c0af88885c4d0363d2fcf35bcb5"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.053427 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vzdz9" event={"ID":"a839597d-be75-4b9f-9d18-a2e8045a674b","Type":"ContainerStarted","Data":"43cf0741e59860177d867a729e291d29d497b1d449c8b71eb006fc2f9b0344a8"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.061555 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6448764df-qdvpl"] Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.066676 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"355f59b0-e1a4-47c9-b24b-fffda1be9952","Type":"ContainerStarted","Data":"fa87d5770606723279f6b17ced44e0a3ae528328c0f2db42338fdc106015cd38"} 
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.078239 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2100-account-create-q4khm" event={"ID":"4146a9d4-bf8b-4bce-aca7-531fb9e272ea","Type":"ContainerStarted","Data":"9f99636ae488741b4d9552f97007ede1d39ca1c9fa52d44505ae794822e14ac0"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.094746 4790 generic.go:334] "Generic (PLEG): container finished" podID="f11270f2-ead7-4001-94f9-54e66cfbe017" containerID="5bcbce90e5f552eaaadc6b92cb86678891a8513ca0aa2b956936185c44add82d" exitCode=0 Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.094854 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-14c0-account-create-82p7s" event={"ID":"f11270f2-ead7-4001-94f9-54e66cfbe017","Type":"ContainerDied","Data":"5bcbce90e5f552eaaadc6b92cb86678891a8513ca0aa2b956936185c44add82d"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.094882 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-14c0-account-create-82p7s" event={"ID":"f11270f2-ead7-4001-94f9-54e66cfbe017","Type":"ContainerStarted","Data":"da91ca42ca561e3625089bdb21460b54f0a60d5569037fb952ed43e8275a7a77"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.095341 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.096692 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ztmgb" podStartSLOduration=4.096680011 podStartE2EDuration="4.096680011s" podCreationTimestamp="2025-10-03 08:58:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:23.030986012 +0000 UTC m=+1089.383920250" watchObservedRunningTime="2025-10-03 08:58:23.096680011 +0000 UTC m=+1089.449614249" Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.122507 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5d29-account-create-jqwpk" event={"ID":"feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc","Type":"ContainerStarted","Data":"c7edc1b05d2086b322f0dfdb304a4f23877a25bd966951077a815ad91c0491f2"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.123816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867694955c-fpnzb" event={"ID":"64754289-91b5-4f84-9809-6271b6a13031","Type":"ContainerStarted","Data":"b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.125691 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e08f91ab-f702-49a8-be10-3114d8a10e48","Type":"ContainerStarted","Data":"e47eef746fe26100f7053a206f35675a1034a7cc37a5c5ac37e36abb7ccc53f6"} Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.140206 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-config-data\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl" Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.140284 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4blv\" (UniqueName: \"kubernetes.io/projected/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-kube-api-access-q4blv\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl" Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.140356 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-scripts\") pod \"horizon-6448764df-qdvpl\" (UID: 
\"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.140380 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-logs\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.140431 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-horizon-secret-key\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.242065 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-scripts\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.242112 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-logs\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.242162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-horizon-secret-key\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.242210 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-config-data\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.242262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4blv\" (UniqueName: \"kubernetes.io/projected/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-kube-api-access-q4blv\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.242735 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-logs\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.246499 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-scripts\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.248946 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-config-data\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.268049 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4blv\" (UniqueName: \"kubernetes.io/projected/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-kube-api-access-q4blv\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.268316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-horizon-secret-key\") pod \"horizon-6448764df-qdvpl\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") " pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.374512 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.741609 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6df867cc-pshbz"
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.759501 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-svc\") pod \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") "
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.824595 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19b25afb-eabe-4e5f-9fbd-39e86c2d8690" (UID: "19b25afb-eabe-4e5f-9fbd-39e86c2d8690"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.863003 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r984\" (UniqueName: \"kubernetes.io/projected/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-kube-api-access-2r984\") pod \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") "
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.863045 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-swift-storage-0\") pod \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") "
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.863080 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-sb\") pod \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") "
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.863156 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-nb\") pod \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") "
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.863206 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-config\") pod \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\" (UID: \"19b25afb-eabe-4e5f-9fbd-39e86c2d8690\") "
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.863704 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.881190 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-kube-api-access-2r984" (OuterVolumeSpecName: "kube-api-access-2r984") pod "19b25afb-eabe-4e5f-9fbd-39e86c2d8690" (UID: "19b25afb-eabe-4e5f-9fbd-39e86c2d8690"). InnerVolumeSpecName "kube-api-access-2r984". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.898608 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "19b25afb-eabe-4e5f-9fbd-39e86c2d8690" (UID: "19b25afb-eabe-4e5f-9fbd-39e86c2d8690"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.898832 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19b25afb-eabe-4e5f-9fbd-39e86c2d8690" (UID: "19b25afb-eabe-4e5f-9fbd-39e86c2d8690"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.899167 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-config" (OuterVolumeSpecName: "config") pod "19b25afb-eabe-4e5f-9fbd-39e86c2d8690" (UID: "19b25afb-eabe-4e5f-9fbd-39e86c2d8690"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.916402 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19b25afb-eabe-4e5f-9fbd-39e86c2d8690" (UID: "19b25afb-eabe-4e5f-9fbd-39e86c2d8690"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.970298 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r984\" (UniqueName: \"kubernetes.io/projected/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-kube-api-access-2r984\") on node \"crc\" DevicePath \"\""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.970357 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.971412 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.971434 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 03 08:58:23 crc kubenswrapper[4790]: I1003 08:58:23.971466 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19b25afb-eabe-4e5f-9fbd-39e86c2d8690-config\") on node \"crc\" DevicePath \"\""
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.124025 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6448764df-qdvpl"]
Oct 03 08:58:24 crc kubenswrapper[4790]: W1003 08:58:24.132609 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a8c4b6_bbe9_4dda_b8b9_4b84fe0864ef.slice/crio-01256a351eb3e28957318809b2108790993abc2462c0abc393d83c0d76eced39 WatchSource:0}: Error finding container 01256a351eb3e28957318809b2108790993abc2462c0abc393d83c0d76eced39: Status 404 returned error can't find the container with id 01256a351eb3e28957318809b2108790993abc2462c0abc393d83c0d76eced39
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.146991 4790 generic.go:334] "Generic (PLEG): container finished" podID="feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc" containerID="3810111fe0dbe40d183e931305a5130d4a7c2a94606d047505cd9ff507de82e7" exitCode=0
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.147081 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5d29-account-create-jqwpk" event={"ID":"feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc","Type":"ContainerDied","Data":"3810111fe0dbe40d183e931305a5130d4a7c2a94606d047505cd9ff507de82e7"}
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.149736 4790 generic.go:334] "Generic (PLEG): container finished" podID="4146a9d4-bf8b-4bce-aca7-531fb9e272ea" containerID="52f70ec8d15ca2b8722d40d109dd87907796a7a8eb04ba50e4a0d89100f00997" exitCode=0
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.149793 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2100-account-create-q4khm" event={"ID":"4146a9d4-bf8b-4bce-aca7-531fb9e272ea","Type":"ContainerDied","Data":"52f70ec8d15ca2b8722d40d109dd87907796a7a8eb04ba50e4a0d89100f00997"}
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.166657 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6df867cc-pshbz"
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.169490 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6df867cc-pshbz" event={"ID":"19b25afb-eabe-4e5f-9fbd-39e86c2d8690","Type":"ContainerDied","Data":"5327c59c1585dfe10b37376951bd7dfbb20267c5493783b52351f5ec4da2b7fa"}
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.169630 4790 scope.go:117] "RemoveContainer" containerID="419d2f8cd743e3a550ea077cc0c27de3068c7ab27c756dc6252eccc2d195eac3"
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.191292 4790 generic.go:334] "Generic (PLEG): container finished" podID="1404dbad-1c09-4b76-a6a1-d563ed8d8bde" containerID="f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e" exitCode=0
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.193406 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" event={"ID":"1404dbad-1c09-4b76-a6a1-d563ed8d8bde","Type":"ContainerDied","Data":"f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e"}
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.281568 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6df867cc-pshbz"]
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.294980 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f6df867cc-pshbz"]
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.408557 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19b25afb-eabe-4e5f-9fbd-39e86c2d8690" path="/var/lib/kubelet/pods/19b25afb-eabe-4e5f-9fbd-39e86c2d8690/volumes"
Oct 03 08:58:24 crc kubenswrapper[4790]: I1003 08:58:24.839488 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-14c0-account-create-82p7s"
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.024385 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl86j\" (UniqueName: \"kubernetes.io/projected/f11270f2-ead7-4001-94f9-54e66cfbe017-kube-api-access-wl86j\") pod \"f11270f2-ead7-4001-94f9-54e66cfbe017\" (UID: \"f11270f2-ead7-4001-94f9-54e66cfbe017\") "
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.062902 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11270f2-ead7-4001-94f9-54e66cfbe017-kube-api-access-wl86j" (OuterVolumeSpecName: "kube-api-access-wl86j") pod "f11270f2-ead7-4001-94f9-54e66cfbe017" (UID: "f11270f2-ead7-4001-94f9-54e66cfbe017"). InnerVolumeSpecName "kube-api-access-wl86j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.146941 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl86j\" (UniqueName: \"kubernetes.io/projected/f11270f2-ead7-4001-94f9-54e66cfbe017-kube-api-access-wl86j\") on node \"crc\" DevicePath \"\""
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.306438 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6448764df-qdvpl" event={"ID":"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef","Type":"ContainerStarted","Data":"01256a351eb3e28957318809b2108790993abc2462c0abc393d83c0d76eced39"}
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.311841 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e08f91ab-f702-49a8-be10-3114d8a10e48","Type":"ContainerStarted","Data":"aaeb72aaa7d01efc6a63c2cdd3df361bb83ca8e4eedad0117b3a034a12e0326c"}
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.337486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" event={"ID":"1404dbad-1c09-4b76-a6a1-d563ed8d8bde","Type":"ContainerStarted","Data":"16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6"}
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.337857 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6454d9f559-66tmq"
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.355753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-14c0-account-create-82p7s" event={"ID":"f11270f2-ead7-4001-94f9-54e66cfbe017","Type":"ContainerDied","Data":"da91ca42ca561e3625089bdb21460b54f0a60d5569037fb952ed43e8275a7a77"}
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.355823 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da91ca42ca561e3625089bdb21460b54f0a60d5569037fb952ed43e8275a7a77"
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.355907 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-14c0-account-create-82p7s"
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.379399 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"355f59b0-e1a4-47c9-b24b-fffda1be9952","Type":"ContainerStarted","Data":"37ae584484cbe528ca7cd11dd582dcd5d81ec5af833e3400cef9e800763bd0e2"}
Oct 03 08:58:25 crc kubenswrapper[4790]: I1003 08:58:25.400305 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" podStartSLOduration=5.400275448 podStartE2EDuration="5.400275448s" podCreationTimestamp="2025-10-03 08:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:25.371177691 +0000 UTC m=+1091.724111929" watchObservedRunningTime="2025-10-03 08:58:25.400275448 +0000 UTC m=+1091.753209706"
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.048948 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5d29-account-create-jqwpk"
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.096444 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb7ql\" (UniqueName: \"kubernetes.io/projected/feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc-kube-api-access-nb7ql\") pod \"feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc\" (UID: \"feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc\") "
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.114341 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc-kube-api-access-nb7ql" (OuterVolumeSpecName: "kube-api-access-nb7ql") pod "feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc" (UID: "feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc"). InnerVolumeSpecName "kube-api-access-nb7ql". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.166622 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2100-account-create-q4khm"
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.200292 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tg7g\" (UniqueName: \"kubernetes.io/projected/4146a9d4-bf8b-4bce-aca7-531fb9e272ea-kube-api-access-2tg7g\") pod \"4146a9d4-bf8b-4bce-aca7-531fb9e272ea\" (UID: \"4146a9d4-bf8b-4bce-aca7-531fb9e272ea\") "
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.200858 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb7ql\" (UniqueName: \"kubernetes.io/projected/feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc-kube-api-access-nb7ql\") on node \"crc\" DevicePath \"\""
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.207767 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4146a9d4-bf8b-4bce-aca7-531fb9e272ea-kube-api-access-2tg7g" (OuterVolumeSpecName: "kube-api-access-2tg7g") pod "4146a9d4-bf8b-4bce-aca7-531fb9e272ea" (UID: "4146a9d4-bf8b-4bce-aca7-531fb9e272ea"). InnerVolumeSpecName "kube-api-access-2tg7g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.304000 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tg7g\" (UniqueName: \"kubernetes.io/projected/4146a9d4-bf8b-4bce-aca7-531fb9e272ea-kube-api-access-2tg7g\") on node \"crc\" DevicePath \"\""
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.410387 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e08f91ab-f702-49a8-be10-3114d8a10e48","Type":"ContainerStarted","Data":"d85291541bd6889ad8fc25a5d8a2d6cd0ba75458c6a4a080f7698e45d3a23786"}
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.410960 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e08f91ab-f702-49a8-be10-3114d8a10e48" containerName="glance-log" containerID="cri-o://aaeb72aaa7d01efc6a63c2cdd3df361bb83ca8e4eedad0117b3a034a12e0326c" gracePeriod=30
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.411453 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e08f91ab-f702-49a8-be10-3114d8a10e48" containerName="glance-httpd" containerID="cri-o://d85291541bd6889ad8fc25a5d8a2d6cd0ba75458c6a4a080f7698e45d3a23786" gracePeriod=30
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.430554 4790 generic.go:334] "Generic (PLEG): container finished" podID="8d78d2bd-0369-4e94-9f44-4e57634ab8b6" containerID="b903ffd9c9737c9e3f4c122cd3010c618d42c0b548b05612976955055812f322" exitCode=0
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.430666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8j2zm" event={"ID":"8d78d2bd-0369-4e94-9f44-4e57634ab8b6","Type":"ContainerDied","Data":"b903ffd9c9737c9e3f4c122cd3010c618d42c0b548b05612976955055812f322"}
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.439636 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5d29-account-create-jqwpk" event={"ID":"feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc","Type":"ContainerDied","Data":"c7edc1b05d2086b322f0dfdb304a4f23877a25bd966951077a815ad91c0491f2"}
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.439679 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7edc1b05d2086b322f0dfdb304a4f23877a25bd966951077a815ad91c0491f2"
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.439747 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5d29-account-create-jqwpk"
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.457915 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"355f59b0-e1a4-47c9-b24b-fffda1be9952","Type":"ContainerStarted","Data":"8857067574a18e2028d2b32588327ef27d82cb8e97bd14e25cf888321d5d19af"}
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.458138 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="355f59b0-e1a4-47c9-b24b-fffda1be9952" containerName="glance-log" containerID="cri-o://37ae584484cbe528ca7cd11dd582dcd5d81ec5af833e3400cef9e800763bd0e2" gracePeriod=30
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.458516 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="355f59b0-e1a4-47c9-b24b-fffda1be9952" containerName="glance-httpd" containerID="cri-o://8857067574a18e2028d2b32588327ef27d82cb8e97bd14e25cf888321d5d19af" gracePeriod=30
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.462457 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.462424207 podStartE2EDuration="6.462424207s" podCreationTimestamp="2025-10-03 08:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:26.443793907 +0000 UTC m=+1092.796728165" watchObservedRunningTime="2025-10-03 08:58:26.462424207 +0000 UTC m=+1092.815358445"
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.469883 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-2100-account-create-q4khm" event={"ID":"4146a9d4-bf8b-4bce-aca7-531fb9e272ea","Type":"ContainerDied","Data":"9f99636ae488741b4d9552f97007ede1d39ca1c9fa52d44505ae794822e14ac0"}
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.469930 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f99636ae488741b4d9552f97007ede1d39ca1c9fa52d44505ae794822e14ac0"
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.470000 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-2100-account-create-q4khm"
Oct 03 08:58:26 crc kubenswrapper[4790]: I1003 08:58:26.531689 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.531669252 podStartE2EDuration="6.531669252s" podCreationTimestamp="2025-10-03 08:58:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:26.523723596 +0000 UTC m=+1092.876657844" watchObservedRunningTime="2025-10-03 08:58:26.531669252 +0000 UTC m=+1092.884603500"
Oct 03 08:58:27 crc kubenswrapper[4790]: I1003 08:58:27.487588 4790 generic.go:334] "Generic (PLEG): container finished" podID="e08f91ab-f702-49a8-be10-3114d8a10e48" containerID="d85291541bd6889ad8fc25a5d8a2d6cd0ba75458c6a4a080f7698e45d3a23786" exitCode=0
Oct 03 08:58:27 crc kubenswrapper[4790]: I1003 08:58:27.487930 4790 generic.go:334] "Generic (PLEG): container finished" podID="e08f91ab-f702-49a8-be10-3114d8a10e48" containerID="aaeb72aaa7d01efc6a63c2cdd3df361bb83ca8e4eedad0117b3a034a12e0326c" exitCode=143
Oct 03 08:58:27 crc kubenswrapper[4790]: I1003 08:58:27.487647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e08f91ab-f702-49a8-be10-3114d8a10e48","Type":"ContainerDied","Data":"d85291541bd6889ad8fc25a5d8a2d6cd0ba75458c6a4a080f7698e45d3a23786"}
Oct 03 08:58:27 crc kubenswrapper[4790]: I1003 08:58:27.488034 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e08f91ab-f702-49a8-be10-3114d8a10e48","Type":"ContainerDied","Data":"aaeb72aaa7d01efc6a63c2cdd3df361bb83ca8e4eedad0117b3a034a12e0326c"}
Oct 03 08:58:27 crc kubenswrapper[4790]: I1003 08:58:27.493381 4790 generic.go:334] "Generic (PLEG): container finished" podID="355f59b0-e1a4-47c9-b24b-fffda1be9952" containerID="37ae584484cbe528ca7cd11dd582dcd5d81ec5af833e3400cef9e800763bd0e2" exitCode=143
Oct 03 08:58:27 crc kubenswrapper[4790]: I1003 08:58:27.494059 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"355f59b0-e1a4-47c9-b24b-fffda1be9952","Type":"ContainerDied","Data":"37ae584484cbe528ca7cd11dd582dcd5d81ec5af833e3400cef9e800763bd0e2"}
Oct 03 08:58:28 crc kubenswrapper[4790]: I1003 08:58:28.520551 4790 generic.go:334] "Generic (PLEG): container finished" podID="355f59b0-e1a4-47c9-b24b-fffda1be9952" containerID="8857067574a18e2028d2b32588327ef27d82cb8e97bd14e25cf888321d5d19af" exitCode=0
Oct 03 08:58:28 crc kubenswrapper[4790]: I1003 08:58:28.520734 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"355f59b0-e1a4-47c9-b24b-fffda1be9952","Type":"ContainerDied","Data":"8857067574a18e2028d2b32588327ef27d82cb8e97bd14e25cf888321d5d19af"}
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.086054 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-867694955c-fpnzb"]
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.129895 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d76b4b9d6-n5tqf"]
Oct 03 08:58:29 crc kubenswrapper[4790]: E1003 08:58:29.130306 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b25afb-eabe-4e5f-9fbd-39e86c2d8690" containerName="init"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.130325 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b25afb-eabe-4e5f-9fbd-39e86c2d8690" containerName="init"
Oct 03 08:58:29 crc kubenswrapper[4790]: E1003 08:58:29.130343 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11270f2-ead7-4001-94f9-54e66cfbe017" containerName="mariadb-account-create"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.130350 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11270f2-ead7-4001-94f9-54e66cfbe017" containerName="mariadb-account-create"
Oct 03 08:58:29 crc kubenswrapper[4790]: E1003 08:58:29.130380 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc" containerName="mariadb-account-create"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.130388 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc" containerName="mariadb-account-create"
Oct 03 08:58:29 crc kubenswrapper[4790]: E1003 08:58:29.130404 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4146a9d4-bf8b-4bce-aca7-531fb9e272ea" containerName="mariadb-account-create"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.130412 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4146a9d4-bf8b-4bce-aca7-531fb9e272ea" containerName="mariadb-account-create"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.130773 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4146a9d4-bf8b-4bce-aca7-531fb9e272ea" containerName="mariadb-account-create"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.130790 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b25afb-eabe-4e5f-9fbd-39e86c2d8690" containerName="init"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.130804 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc" containerName="mariadb-account-create"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.130813 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11270f2-ead7-4001-94f9-54e66cfbe017" containerName="mariadb-account-create"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.131716 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d76b4b9d6-n5tqf"]
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.131798 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.153165 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.241051 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6448764df-qdvpl"]
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.256415 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-config-data\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.256464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-scripts\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.256520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-secret-key\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.256539 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-combined-ca-bundle\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.256555 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-tls-certs\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.256599 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx4h7\" (UniqueName: \"kubernetes.io/projected/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-kube-api-access-zx4h7\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.256638 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-logs\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.260066 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86f486799b-gwc4l"]
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.263213 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86f486799b-gwc4l"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.283912 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86f486799b-gwc4l"]
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.358886 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-config-data\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.358974 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-scripts\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.359085 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-secret-key\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.359104 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-combined-ca-bundle\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.359123 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-tls-certs\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.359298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx4h7\" (UniqueName: \"kubernetes.io/projected/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-kube-api-access-zx4h7\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.360723 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-config-data\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.361035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-logs\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.361308 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-logs\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.362119 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-scripts\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.372276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-secret-key\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.373169 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-tls-certs\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.373708 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-combined-ca-bundle\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.378386 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx4h7\" (UniqueName: \"kubernetes.io/projected/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-kube-api-access-zx4h7\") pod \"horizon-7d76b4b9d6-n5tqf\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " pod="openstack/horizon-7d76b4b9d6-n5tqf"
Oct 03 08:58:29 crc
kubenswrapper[4790]: I1003 08:58:29.462630 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b9b4763-1706-4c7e-b39f-0d8311a9637d-horizon-secret-key\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.462740 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b9b4763-1706-4c7e-b39f-0d8311a9637d-scripts\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.462772 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns97q\" (UniqueName: \"kubernetes.io/projected/6b9b4763-1706-4c7e-b39f-0d8311a9637d-kube-api-access-ns97q\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.462811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b9b4763-1706-4c7e-b39f-0d8311a9637d-config-data\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.462829 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b9b4763-1706-4c7e-b39f-0d8311a9637d-horizon-tls-certs\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 
crc kubenswrapper[4790]: I1003 08:58:29.463033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9b4763-1706-4c7e-b39f-0d8311a9637d-combined-ca-bundle\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.463079 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b9b4763-1706-4c7e-b39f-0d8311a9637d-logs\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.475128 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d76b4b9d6-n5tqf" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.553230 4790 generic.go:334] "Generic (PLEG): container finished" podID="43741e1d-af3f-4800-a5a7-cca0f6620cbc" containerID="30e0fe45b4ecd503d294bff8ce70c90a46cb82b7bf568d3f1c53ed5fb95d4976" exitCode=0 Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.553410 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ztmgb" event={"ID":"43741e1d-af3f-4800-a5a7-cca0f6620cbc","Type":"ContainerDied","Data":"30e0fe45b4ecd503d294bff8ce70c90a46cb82b7bf568d3f1c53ed5fb95d4976"} Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.564702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b9b4763-1706-4c7e-b39f-0d8311a9637d-scripts\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.564776 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ns97q\" (UniqueName: \"kubernetes.io/projected/6b9b4763-1706-4c7e-b39f-0d8311a9637d-kube-api-access-ns97q\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.564825 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b9b4763-1706-4c7e-b39f-0d8311a9637d-config-data\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.564842 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b9b4763-1706-4c7e-b39f-0d8311a9637d-horizon-tls-certs\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.564912 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9b4763-1706-4c7e-b39f-0d8311a9637d-combined-ca-bundle\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.564936 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b9b4763-1706-4c7e-b39f-0d8311a9637d-logs\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.564987 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/6b9b4763-1706-4c7e-b39f-0d8311a9637d-horizon-secret-key\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.567158 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b9b4763-1706-4c7e-b39f-0d8311a9637d-config-data\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.567589 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b9b4763-1706-4c7e-b39f-0d8311a9637d-scripts\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.569330 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b9b4763-1706-4c7e-b39f-0d8311a9637d-logs\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.577238 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9b4763-1706-4c7e-b39f-0d8311a9637d-combined-ca-bundle\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.577237 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6b9b4763-1706-4c7e-b39f-0d8311a9637d-horizon-secret-key\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " 
pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.587451 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns97q\" (UniqueName: \"kubernetes.io/projected/6b9b4763-1706-4c7e-b39f-0d8311a9637d-kube-api-access-ns97q\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.588233 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b9b4763-1706-4c7e-b39f-0d8311a9637d-horizon-tls-certs\") pod \"horizon-86f486799b-gwc4l\" (UID: \"6b9b4763-1706-4c7e-b39f-0d8311a9637d\") " pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.600196 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.679125 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.868716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj29t\" (UniqueName: \"kubernetes.io/projected/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-kube-api-access-cj29t\") pod \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.868780 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-db-sync-config-data\") pod \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.868954 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-config-data\") pod \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.869015 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-combined-ca-bundle\") pod \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\" (UID: \"8d78d2bd-0369-4e94-9f44-4e57634ab8b6\") " Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.874044 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8d78d2bd-0369-4e94-9f44-4e57634ab8b6" (UID: "8d78d2bd-0369-4e94-9f44-4e57634ab8b6"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.874346 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-kube-api-access-cj29t" (OuterVolumeSpecName: "kube-api-access-cj29t") pod "8d78d2bd-0369-4e94-9f44-4e57634ab8b6" (UID: "8d78d2bd-0369-4e94-9f44-4e57634ab8b6"). InnerVolumeSpecName "kube-api-access-cj29t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.894140 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d78d2bd-0369-4e94-9f44-4e57634ab8b6" (UID: "8d78d2bd-0369-4e94-9f44-4e57634ab8b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.943789 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-config-data" (OuterVolumeSpecName: "config-data") pod "8d78d2bd-0369-4e94-9f44-4e57634ab8b6" (UID: "8d78d2bd-0369-4e94-9f44-4e57634ab8b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.971621 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj29t\" (UniqueName: \"kubernetes.io/projected/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-kube-api-access-cj29t\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.971670 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.971684 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:29 crc kubenswrapper[4790]: I1003 08:58:29.971700 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d78d2bd-0369-4e94-9f44-4e57634ab8b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.560856 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-brfpm"] Oct 03 08:58:30 crc kubenswrapper[4790]: E1003 08:58:30.561641 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d78d2bd-0369-4e94-9f44-4e57634ab8b6" containerName="watcher-db-sync" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.561660 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d78d2bd-0369-4e94-9f44-4e57634ab8b6" containerName="watcher-db-sync" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.561878 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d78d2bd-0369-4e94-9f44-4e57634ab8b6" containerName="watcher-db-sync" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.564872 4790 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.571772 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.572119 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-kp26b" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.574603 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-brfpm"] Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.577305 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8j2zm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.577355 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8j2zm" event={"ID":"8d78d2bd-0369-4e94-9f44-4e57634ab8b6","Type":"ContainerDied","Data":"a7e378512a498b341651339187ed17bd8ccc34053fdfdfc76665dc157cbdd7b3"} Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.577386 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7e378512a498b341651339187ed17bd8ccc34053fdfdfc76665dc157cbdd7b3" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.589666 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk7df\" (UniqueName: \"kubernetes.io/projected/20340665-5e73-4bfd-8a1a-6d21e0d16432-kube-api-access-zk7df\") pod \"barbican-db-sync-brfpm\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.602929 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-combined-ca-bundle\") pod 
\"barbican-db-sync-brfpm\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.603240 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-db-sync-config-data\") pod \"barbican-db-sync-brfpm\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.721812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-combined-ca-bundle\") pod \"barbican-db-sync-brfpm\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.721914 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-db-sync-config-data\") pod \"barbican-db-sync-brfpm\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.721968 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk7df\" (UniqueName: \"kubernetes.io/projected/20340665-5e73-4bfd-8a1a-6d21e0d16432-kube-api-access-zk7df\") pod \"barbican-db-sync-brfpm\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.727829 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-combined-ca-bundle\") pod \"barbican-db-sync-brfpm\" (UID: 
\"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.737193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk7df\" (UniqueName: \"kubernetes.io/projected/20340665-5e73-4bfd-8a1a-6d21e0d16432-kube-api-access-zk7df\") pod \"barbican-db-sync-brfpm\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.752350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-db-sync-config-data\") pod \"barbican-db-sync-brfpm\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.891339 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-brfpm" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.923291 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.929429 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-gklmz"] Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.945382 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.949232 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.949487 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5pgsx" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.949633 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 08:58:30 crc kubenswrapper[4790]: I1003 08:58:30.955954 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gklmz"] Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.014645 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.016330 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.019818 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.031199 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-5f8dh" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.033701 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96250103-59af-4ec6-a036-f6e8222b02e0-etc-machine-id\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040302 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-config-data\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040334 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv7rx\" (UniqueName: \"kubernetes.io/projected/96250103-59af-4ec6-a036-f6e8222b02e0-kube-api-access-hv7rx\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040359 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-config-data\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040377 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040416 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35a981dd-1fb1-4a29-a36c-592ea3925ff7-logs\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040477 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040501 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-combined-ca-bundle\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040547 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-db-sync-config-data\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040569 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwdjj\" (UniqueName: \"kubernetes.io/projected/35a981dd-1fb1-4a29-a36c-592ea3925ff7-kube-api-access-xwdjj\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.040634 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-scripts\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.049342 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675946d6fc-std9f"] Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 
08:58:31.049618 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675946d6fc-std9f" podUID="dd4153b0-ab10-423f-bb3a-be4bac79c118" containerName="dnsmasq-dns" containerID="cri-o://dc3f67af7517d1bed8849c40f9891ea9a8e11f9f2be2fb0354d64d3b6488e4ca" gracePeriod=10 Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143204 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96250103-59af-4ec6-a036-f6e8222b02e0-etc-machine-id\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143286 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-config-data\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143359 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv7rx\" (UniqueName: \"kubernetes.io/projected/96250103-59af-4ec6-a036-f6e8222b02e0-kube-api-access-hv7rx\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143382 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-config-data\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143407 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143446 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35a981dd-1fb1-4a29-a36c-592ea3925ff7-logs\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143506 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143551 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-combined-ca-bundle\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-db-sync-config-data\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143672 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwdjj\" (UniqueName: \"kubernetes.io/projected/35a981dd-1fb1-4a29-a36c-592ea3925ff7-kube-api-access-xwdjj\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " 
pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.143704 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-scripts\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.146806 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35a981dd-1fb1-4a29-a36c-592ea3925ff7-logs\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.148727 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96250103-59af-4ec6-a036-f6e8222b02e0-etc-machine-id\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.158687 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-config-data\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.161212 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.208460 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-combined-ca-bundle\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.208911 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-db-sync-config-data\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.232910 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-config-data\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.240597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.241021 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-scripts\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.275179 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-mfdjk"] Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.278303 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.292079 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.293312 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wkzbz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.294877 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.337643 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mfdjk"] Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.340206 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv7rx\" (UniqueName: \"kubernetes.io/projected/96250103-59af-4ec6-a036-f6e8222b02e0-kube-api-access-hv7rx\") pod \"cinder-db-sync-gklmz\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.369847 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-config\") pod \"neutron-db-sync-mfdjk\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") " pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.369986 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts7hg\" (UniqueName: \"kubernetes.io/projected/df20e14b-d7d8-4cab-8b7c-61122da75fbf-kube-api-access-ts7hg\") pod \"neutron-db-sync-mfdjk\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") " pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.370065 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-combined-ca-bundle\") pod \"neutron-db-sync-mfdjk\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") " pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.459642 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwdjj\" (UniqueName: \"kubernetes.io/projected/35a981dd-1fb1-4a29-a36c-592ea3925ff7-kube-api-access-xwdjj\") pod \"watcher-api-0\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.460638 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.461968 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.465374 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.469939 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.471481 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.473050 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-config\") pod \"neutron-db-sync-mfdjk\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") " pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.473138 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts7hg\" (UniqueName: \"kubernetes.io/projected/df20e14b-d7d8-4cab-8b7c-61122da75fbf-kube-api-access-ts7hg\") pod \"neutron-db-sync-mfdjk\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") " pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.473193 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-combined-ca-bundle\") pod \"neutron-db-sync-mfdjk\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") " pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.474554 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.486164 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-combined-ca-bundle\") pod \"neutron-db-sync-mfdjk\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") " pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.503057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-config\") pod 
\"neutron-db-sync-mfdjk\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") " pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.503669 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.514474 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts7hg\" (UniqueName: \"kubernetes.io/projected/df20e14b-d7d8-4cab-8b7c-61122da75fbf-kube-api-access-ts7hg\") pod \"neutron-db-sync-mfdjk\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") " pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.528985 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.572846 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gklmz" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.575625 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.575677 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5jw5\" (UniqueName: \"kubernetes.io/projected/67485d96-5ca0-4195-bb7a-d09d03d3dc49-kube-api-access-d5jw5\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.575708 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/67485d96-5ca0-4195-bb7a-d09d03d3dc49-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.575737 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.575755 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0afbb0f-f4dc-4814-8c7f-791351972c0b-logs\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.575770 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67485d96-5ca0-4195-bb7a-d09d03d3dc49-config-data\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.575802 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67485d96-5ca0-4195-bb7a-d09d03d3dc49-logs\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.575840 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-combined-ca-bundle\") pod 
\"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.575860 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncf84\" (UniqueName: \"kubernetes.io/projected/f0afbb0f-f4dc-4814-8c7f-791351972c0b-kube-api-access-ncf84\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.599720 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd4153b0-ab10-423f-bb3a-be4bac79c118" containerID="dc3f67af7517d1bed8849c40f9891ea9a8e11f9f2be2fb0354d64d3b6488e4ca" exitCode=0 Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.599777 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675946d6fc-std9f" event={"ID":"dd4153b0-ab10-423f-bb3a-be4bac79c118","Type":"ContainerDied","Data":"dc3f67af7517d1bed8849c40f9891ea9a8e11f9f2be2fb0354d64d3b6488e4ca"} Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.602232 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675946d6fc-std9f" podUID="dd4153b0-ab10-423f-bb3a-be4bac79c118" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.602255 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mfdjk" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.639533 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.677982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.678063 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5jw5\" (UniqueName: \"kubernetes.io/projected/67485d96-5ca0-4195-bb7a-d09d03d3dc49-kube-api-access-d5jw5\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.678098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67485d96-5ca0-4195-bb7a-d09d03d3dc49-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.678124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.678139 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0afbb0f-f4dc-4814-8c7f-791351972c0b-logs\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.678156 
4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67485d96-5ca0-4195-bb7a-d09d03d3dc49-config-data\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.678200 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67485d96-5ca0-4195-bb7a-d09d03d3dc49-logs\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.678252 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.678271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncf84\" (UniqueName: \"kubernetes.io/projected/f0afbb0f-f4dc-4814-8c7f-791351972c0b-kube-api-access-ncf84\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.679651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67485d96-5ca0-4195-bb7a-d09d03d3dc49-logs\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.679764 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0afbb0f-f4dc-4814-8c7f-791351972c0b-logs\") pod 
\"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.682512 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.682519 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67485d96-5ca0-4195-bb7a-d09d03d3dc49-config-data\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.689762 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.690425 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67485d96-5ca0-4195-bb7a-d09d03d3dc49-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.693335 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-config-data\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc 
kubenswrapper[4790]: I1003 08:58:31.696120 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5jw5\" (UniqueName: \"kubernetes.io/projected/67485d96-5ca0-4195-bb7a-d09d03d3dc49-kube-api-access-d5jw5\") pod \"watcher-applier-0\" (UID: \"67485d96-5ca0-4195-bb7a-d09d03d3dc49\") " pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.701983 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncf84\" (UniqueName: \"kubernetes.io/projected/f0afbb0f-f4dc-4814-8c7f-791351972c0b-kube-api-access-ncf84\") pod \"watcher-decision-engine-0\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " pod="openstack/watcher-decision-engine-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.918556 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 03 08:58:31 crc kubenswrapper[4790]: I1003 08:58:31.924220 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 08:58:36 crc kubenswrapper[4790]: I1003 08:58:36.602040 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675946d6fc-std9f" podUID="dd4153b0-ab10-423f-bb3a-be4bac79c118" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Oct 03 08:58:38 crc kubenswrapper[4790]: E1003 08:58:38.110517 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Oct 03 08:58:38 crc kubenswrapper[4790]: E1003 08:58:38.111116 4790 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Oct 03 08:58:38 crc kubenswrapper[4790]: E1003 08:58:38.111323 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.58:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58bh679h9ch656h64bhffhch75h67fh75h5ddh5fbh68dh568hf7h5b9h98h568h5d4h84h64dh5d7hd8h66fh65bh644h698hcdh57h9chbfh67bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmsgj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(8a0817c5-b72b-4433-977d-6185317b0885): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.336679 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.432457 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-scripts\") pod \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.432541 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-credential-keys\") pod \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.432643 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-config-data\") pod \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.432892 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82vsb\" (UniqueName: \"kubernetes.io/projected/43741e1d-af3f-4800-a5a7-cca0f6620cbc-kube-api-access-82vsb\") pod \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.433018 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-combined-ca-bundle\") pod \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.433126 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-fernet-keys\") pod \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\" (UID: \"43741e1d-af3f-4800-a5a7-cca0f6620cbc\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.437117 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-scripts" (OuterVolumeSpecName: "scripts") pod "43741e1d-af3f-4800-a5a7-cca0f6620cbc" (UID: "43741e1d-af3f-4800-a5a7-cca0f6620cbc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.439208 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43741e1d-af3f-4800-a5a7-cca0f6620cbc-kube-api-access-82vsb" (OuterVolumeSpecName: "kube-api-access-82vsb") pod "43741e1d-af3f-4800-a5a7-cca0f6620cbc" (UID: "43741e1d-af3f-4800-a5a7-cca0f6620cbc"). InnerVolumeSpecName "kube-api-access-82vsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.441686 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "43741e1d-af3f-4800-a5a7-cca0f6620cbc" (UID: "43741e1d-af3f-4800-a5a7-cca0f6620cbc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.444956 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "43741e1d-af3f-4800-a5a7-cca0f6620cbc" (UID: "43741e1d-af3f-4800-a5a7-cca0f6620cbc"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.473730 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43741e1d-af3f-4800-a5a7-cca0f6620cbc" (UID: "43741e1d-af3f-4800-a5a7-cca0f6620cbc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.478786 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-config-data" (OuterVolumeSpecName: "config-data") pod "43741e1d-af3f-4800-a5a7-cca0f6620cbc" (UID: "43741e1d-af3f-4800-a5a7-cca0f6620cbc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.535580 4790 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.535613 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.535624 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82vsb\" (UniqueName: \"kubernetes.io/projected/43741e1d-af3f-4800-a5a7-cca0f6620cbc-kube-api-access-82vsb\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.535632 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.535643 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.535650 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43741e1d-af3f-4800-a5a7-cca0f6620cbc-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.695938 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ztmgb" event={"ID":"43741e1d-af3f-4800-a5a7-cca0f6620cbc","Type":"ContainerDied","Data":"62f713261b38fce8904840d58b61fb3003e76924aec92ae9ead3903b8bf7a343"} Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.696345 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62f713261b38fce8904840d58b61fb3003e76924aec92ae9ead3903b8bf7a343" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.696108 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ztmgb" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.786585 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.842148 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txmgh\" (UniqueName: \"kubernetes.io/projected/dd4153b0-ab10-423f-bb3a-be4bac79c118-kube-api-access-txmgh\") pod \"dd4153b0-ab10-423f-bb3a-be4bac79c118\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.842721 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-sb\") pod \"dd4153b0-ab10-423f-bb3a-be4bac79c118\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.842805 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-config\") pod \"dd4153b0-ab10-423f-bb3a-be4bac79c118\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.842900 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-dns-svc\") pod \"dd4153b0-ab10-423f-bb3a-be4bac79c118\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.842986 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-nb\") pod \"dd4153b0-ab10-423f-bb3a-be4bac79c118\" (UID: \"dd4153b0-ab10-423f-bb3a-be4bac79c118\") " Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.873035 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/dd4153b0-ab10-423f-bb3a-be4bac79c118-kube-api-access-txmgh" (OuterVolumeSpecName: "kube-api-access-txmgh") pod "dd4153b0-ab10-423f-bb3a-be4bac79c118" (UID: "dd4153b0-ab10-423f-bb3a-be4bac79c118"). InnerVolumeSpecName "kube-api-access-txmgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.949589 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txmgh\" (UniqueName: \"kubernetes.io/projected/dd4153b0-ab10-423f-bb3a-be4bac79c118-kube-api-access-txmgh\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.962186 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd4153b0-ab10-423f-bb3a-be4bac79c118" (UID: "dd4153b0-ab10-423f-bb3a-be4bac79c118"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.966339 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dd4153b0-ab10-423f-bb3a-be4bac79c118" (UID: "dd4153b0-ab10-423f-bb3a-be4bac79c118"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.972791 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dd4153b0-ab10-423f-bb3a-be4bac79c118" (UID: "dd4153b0-ab10-423f-bb3a-be4bac79c118"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:38 crc kubenswrapper[4790]: I1003 08:58:38.985166 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-config" (OuterVolumeSpecName: "config") pod "dd4153b0-ab10-423f-bb3a-be4bac79c118" (UID: "dd4153b0-ab10-423f-bb3a-be4bac79c118"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.051450 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.051493 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.051506 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.051516 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd4153b0-ab10-423f-bb3a-be4bac79c118-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.454819 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ztmgb"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.466370 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ztmgb"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.520192 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bbwh5"] Oct 03 
08:58:39 crc kubenswrapper[4790]: E1003 08:58:39.520582 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43741e1d-af3f-4800-a5a7-cca0f6620cbc" containerName="keystone-bootstrap" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.520600 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="43741e1d-af3f-4800-a5a7-cca0f6620cbc" containerName="keystone-bootstrap" Oct 03 08:58:39 crc kubenswrapper[4790]: E1003 08:58:39.520616 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4153b0-ab10-423f-bb3a-be4bac79c118" containerName="dnsmasq-dns" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.520622 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4153b0-ab10-423f-bb3a-be4bac79c118" containerName="dnsmasq-dns" Oct 03 08:58:39 crc kubenswrapper[4790]: E1003 08:58:39.520640 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4153b0-ab10-423f-bb3a-be4bac79c118" containerName="init" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.520646 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4153b0-ab10-423f-bb3a-be4bac79c118" containerName="init" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.520823 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4153b0-ab10-423f-bb3a-be4bac79c118" containerName="dnsmasq-dns" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.520839 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="43741e1d-af3f-4800-a5a7-cca0f6620cbc" containerName="keystone-bootstrap" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.521479 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.525727 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.526886 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fnq8f" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.527204 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.527316 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.554155 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bbwh5"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.622610 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-gklmz"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.624284 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.635358 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-brfpm"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.648755 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.648843 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mfdjk"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.667101 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8xlz\" (UniqueName: \"kubernetes.io/projected/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-kube-api-access-z8xlz\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.667186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-credential-keys\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.667271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-combined-ca-bundle\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.667295 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-fernet-keys\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.667331 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-config-data\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.667357 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-scripts\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.667461 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d76b4b9d6-n5tqf"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.678789 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.685724 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.712299 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 08:58:39 crc kubenswrapper[4790]: W1003 08:58:39.729202 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67485d96_5ca0_4195_bb7a_d09d03d3dc49.slice/crio-065a93cf76deee6299a5a5a7d3e1dbafa3562a7c693ca01f268c1f7666811b5c WatchSource:0}: Error finding container 065a93cf76deee6299a5a5a7d3e1dbafa3562a7c693ca01f268c1f7666811b5c: Status 404 returned error can't find the container with id 065a93cf76deee6299a5a5a7d3e1dbafa3562a7c693ca01f268c1f7666811b5c Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.732749 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6448764df-qdvpl" 
event={"ID":"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef","Type":"ContainerStarted","Data":"fc91064cded5b32a5632b39284688d49158263bf22c6a86f5afd9e7d3936de2f"} Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.737844 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.738293 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e08f91ab-f702-49a8-be10-3114d8a10e48","Type":"ContainerDied","Data":"e47eef746fe26100f7053a206f35675a1034a7cc37a5c5ac37e36abb7ccc53f6"} Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.738351 4790 scope.go:117] "RemoveContainer" containerID="d85291541bd6889ad8fc25a5d8a2d6cd0ba75458c6a4a080f7698e45d3a23786" Oct 03 08:58:39 crc kubenswrapper[4790]: W1003 08:58:39.739853 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0afbb0f_f4dc_4814_8c7f_791351972c0b.slice/crio-a908578b6502ff3984c4ef14391723bc62abedc7411bba3d07eac6cc51be543c WatchSource:0}: Error finding container a908578b6502ff3984c4ef14391723bc62abedc7411bba3d07eac6cc51be543c: Status 404 returned error can't find the container with id a908578b6502ff3984c4ef14391723bc62abedc7411bba3d07eac6cc51be543c Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.740871 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d887b5cbf-5qjjt" event={"ID":"8a368664-2338-4579-ae15-981ee5ca3194","Type":"ContainerStarted","Data":"c9c519b5c6dd5924b44d4d2bc8af4c14d93fd4cea6763325b628d674eb8b39b4"} Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.754373 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vzdz9" event={"ID":"a839597d-be75-4b9f-9d18-a2e8045a674b","Type":"ContainerStarted","Data":"436f9b4ae537c4ee8eadbceb94dcc53da18a9ada7d1217d7ec775e3e975c1a8d"} 
Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.759266 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675946d6fc-std9f" event={"ID":"dd4153b0-ab10-423f-bb3a-be4bac79c118","Type":"ContainerDied","Data":"f54334b10bed2f7e8edd859681bff19c659e0ce02efa3af5d54188dac7ee56ca"} Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.759387 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675946d6fc-std9f" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.763750 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.764551 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"355f59b0-e1a4-47c9-b24b-fffda1be9952","Type":"ContainerDied","Data":"fa87d5770606723279f6b17ced44e0a3ae528328c0f2db42338fdc106015cd38"} Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769155 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"355f59b0-e1a4-47c9-b24b-fffda1be9952\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769343 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-internal-tls-certs\") pod \"e08f91ab-f702-49a8-be10-3114d8a10e48\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769384 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-scripts\") pod \"355f59b0-e1a4-47c9-b24b-fffda1be9952\" (UID: 
\"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769408 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-httpd-run\") pod \"355f59b0-e1a4-47c9-b24b-fffda1be9952\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769500 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-scripts\") pod \"e08f91ab-f702-49a8-be10-3114d8a10e48\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769519 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-public-tls-certs\") pod \"355f59b0-e1a4-47c9-b24b-fffda1be9952\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769564 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-httpd-run\") pod \"e08f91ab-f702-49a8-be10-3114d8a10e48\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769664 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e08f91ab-f702-49a8-be10-3114d8a10e48\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769701 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-config-data\") pod 
\"e08f91ab-f702-49a8-be10-3114d8a10e48\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769766 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ngs9\" (UniqueName: \"kubernetes.io/projected/355f59b0-e1a4-47c9-b24b-fffda1be9952-kube-api-access-7ngs9\") pod \"355f59b0-e1a4-47c9-b24b-fffda1be9952\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769806 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-logs\") pod \"355f59b0-e1a4-47c9-b24b-fffda1be9952\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzq4n\" (UniqueName: \"kubernetes.io/projected/e08f91ab-f702-49a8-be10-3114d8a10e48-kube-api-access-mzq4n\") pod \"e08f91ab-f702-49a8-be10-3114d8a10e48\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769915 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-logs\") pod \"e08f91ab-f702-49a8-be10-3114d8a10e48\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769950 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-config-data\") pod \"355f59b0-e1a4-47c9-b24b-fffda1be9952\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.769982 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-combined-ca-bundle\") pod \"355f59b0-e1a4-47c9-b24b-fffda1be9952\" (UID: \"355f59b0-e1a4-47c9-b24b-fffda1be9952\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.770011 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-combined-ca-bundle\") pod \"e08f91ab-f702-49a8-be10-3114d8a10e48\" (UID: \"e08f91ab-f702-49a8-be10-3114d8a10e48\") " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.770305 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-scripts\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.770389 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8xlz\" (UniqueName: \"kubernetes.io/projected/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-kube-api-access-z8xlz\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.770464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-credential-keys\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.770557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-combined-ca-bundle\") pod 
\"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.770601 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-fernet-keys\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.770634 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-config-data\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.772980 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "355f59b0-e1a4-47c9-b24b-fffda1be9952" (UID: "355f59b0-e1a4-47c9-b24b-fffda1be9952"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.773685 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e08f91ab-f702-49a8-be10-3114d8a10e48" (UID: "e08f91ab-f702-49a8-be10-3114d8a10e48"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.775019 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-vzdz9" podStartSLOduration=3.454554778 podStartE2EDuration="19.775000667s" podCreationTimestamp="2025-10-03 08:58:20 +0000 UTC" firstStartedPulling="2025-10-03 08:58:22.080983204 +0000 UTC m=+1088.433917442" lastFinishedPulling="2025-10-03 08:58:38.401429093 +0000 UTC m=+1104.754363331" observedRunningTime="2025-10-03 08:58:39.770516053 +0000 UTC m=+1106.123450291" watchObservedRunningTime="2025-10-03 08:58:39.775000667 +0000 UTC m=+1106.127934905" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.776054 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-logs" (OuterVolumeSpecName: "logs") pod "355f59b0-e1a4-47c9-b24b-fffda1be9952" (UID: "355f59b0-e1a4-47c9-b24b-fffda1be9952"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.779996 4790 scope.go:117] "RemoveContainer" containerID="aaeb72aaa7d01efc6a63c2cdd3df361bb83ca8e4eedad0117b3a034a12e0326c" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.780457 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-logs" (OuterVolumeSpecName: "logs") pod "e08f91ab-f702-49a8-be10-3114d8a10e48" (UID: "e08f91ab-f702-49a8-be10-3114d8a10e48"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.782335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867694955c-fpnzb" event={"ID":"64754289-91b5-4f84-9809-6271b6a13031","Type":"ContainerStarted","Data":"19ff0d78e06cc94459f21d1f5f64963a5b790c52733b5c522801481fb8aaedea"} Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.799822 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-scripts" (OuterVolumeSpecName: "scripts") pod "e08f91ab-f702-49a8-be10-3114d8a10e48" (UID: "e08f91ab-f702-49a8-be10-3114d8a10e48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.799832 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "355f59b0-e1a4-47c9-b24b-fffda1be9952" (UID: "355f59b0-e1a4-47c9-b24b-fffda1be9952"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.801627 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gklmz" event={"ID":"96250103-59af-4ec6-a036-f6e8222b02e0","Type":"ContainerStarted","Data":"667fdf6cae749ef69d70ba0c8033b599c355a044033c89f0595179dd66ca3613"} Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.805185 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-credential-keys\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.812449 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-scripts\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.812870 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08f91ab-f702-49a8-be10-3114d8a10e48-kube-api-access-mzq4n" (OuterVolumeSpecName: "kube-api-access-mzq4n") pod "e08f91ab-f702-49a8-be10-3114d8a10e48" (UID: "e08f91ab-f702-49a8-be10-3114d8a10e48"). InnerVolumeSpecName "kube-api-access-mzq4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.814431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8xlz\" (UniqueName: \"kubernetes.io/projected/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-kube-api-access-z8xlz\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.818763 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e08f91ab-f702-49a8-be10-3114d8a10e48" (UID: "e08f91ab-f702-49a8-be10-3114d8a10e48"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.825032 4790 scope.go:117] "RemoveContainer" containerID="dc3f67af7517d1bed8849c40f9891ea9a8e11f9f2be2fb0354d64d3b6488e4ca" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.831194 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86f486799b-gwc4l"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.837596 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-combined-ca-bundle\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.842991 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675946d6fc-std9f"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.854529 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-fernet-keys\") pod 
\"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.869967 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675946d6fc-std9f"] Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.872242 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.872300 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.872314 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.872327 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzq4n\" (UniqueName: \"kubernetes.io/projected/e08f91ab-f702-49a8-be10-3114d8a10e48-kube-api-access-mzq4n\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.872339 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e08f91ab-f702-49a8-be10-3114d8a10e48-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.872358 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.872370 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/355f59b0-e1a4-47c9-b24b-fffda1be9952-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.872381 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.874474 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-config-data\") pod \"keystone-bootstrap-bbwh5\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.884801 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355f59b0-e1a4-47c9-b24b-fffda1be9952-kube-api-access-7ngs9" (OuterVolumeSpecName: "kube-api-access-7ngs9") pod "355f59b0-e1a4-47c9-b24b-fffda1be9952" (UID: "355f59b0-e1a4-47c9-b24b-fffda1be9952"). InnerVolumeSpecName "kube-api-access-7ngs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.886628 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-scripts" (OuterVolumeSpecName: "scripts") pod "355f59b0-e1a4-47c9-b24b-fffda1be9952" (UID: "355f59b0-e1a4-47c9-b24b-fffda1be9952"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.918614 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.940700 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e08f91ab-f702-49a8-be10-3114d8a10e48" (UID: "e08f91ab-f702-49a8-be10-3114d8a10e48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.947539 4790 scope.go:117] "RemoveContainer" containerID="7fcda0e7cf3b307ca9c13a7755139b6213989832222956f1fe141e835ff7c023" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.948305 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.964089 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "355f59b0-e1a4-47c9-b24b-fffda1be9952" (UID: "355f59b0-e1a4-47c9-b24b-fffda1be9952"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.974394 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ngs9\" (UniqueName: \"kubernetes.io/projected/355f59b0-e1a4-47c9-b24b-fffda1be9952-kube-api-access-7ngs9\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.974426 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.974437 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.974446 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.974457 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.974467 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.978766 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e08f91ab-f702-49a8-be10-3114d8a10e48" (UID: "e08f91ab-f702-49a8-be10-3114d8a10e48"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:39 crc kubenswrapper[4790]: I1003 08:58:39.999585 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-config-data" (OuterVolumeSpecName: "config-data") pod "355f59b0-e1a4-47c9-b24b-fffda1be9952" (UID: "355f59b0-e1a4-47c9-b24b-fffda1be9952"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.003365 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.008675 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-config-data" (OuterVolumeSpecName: "config-data") pod "e08f91ab-f702-49a8-be10-3114d8a10e48" (UID: "e08f91ab-f702-49a8-be10-3114d8a10e48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.012717 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "355f59b0-e1a4-47c9-b24b-fffda1be9952" (UID: "355f59b0-e1a4-47c9-b24b-fffda1be9952"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.044239 4790 scope.go:117] "RemoveContainer" containerID="8857067574a18e2028d2b32588327ef27d82cb8e97bd14e25cf888321d5d19af" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.079592 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.079624 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.079809 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355f59b0-e1a4-47c9-b24b-fffda1be9952-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.079824 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e08f91ab-f702-49a8-be10-3114d8a10e48-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.120361 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.150995 4790 scope.go:117] "RemoveContainer" containerID="37ae584484cbe528ca7cd11dd582dcd5d81ec5af833e3400cef9e800763bd0e2" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.161357 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.185620 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.185650 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.207085 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:58:40 crc kubenswrapper[4790]: E1003 08:58:40.207526 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08f91ab-f702-49a8-be10-3114d8a10e48" containerName="glance-httpd" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.207538 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08f91ab-f702-49a8-be10-3114d8a10e48" containerName="glance-httpd" Oct 03 08:58:40 crc kubenswrapper[4790]: E1003 08:58:40.207556 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355f59b0-e1a4-47c9-b24b-fffda1be9952" containerName="glance-httpd" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.207562 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="355f59b0-e1a4-47c9-b24b-fffda1be9952" containerName="glance-httpd" Oct 03 08:58:40 crc kubenswrapper[4790]: E1003 08:58:40.207591 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355f59b0-e1a4-47c9-b24b-fffda1be9952" containerName="glance-log" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.207597 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="355f59b0-e1a4-47c9-b24b-fffda1be9952" containerName="glance-log" Oct 03 08:58:40 crc kubenswrapper[4790]: E1003 08:58:40.207626 4790 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08f91ab-f702-49a8-be10-3114d8a10e48" containerName="glance-log" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.207632 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08f91ab-f702-49a8-be10-3114d8a10e48" containerName="glance-log" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.207794 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08f91ab-f702-49a8-be10-3114d8a10e48" containerName="glance-httpd" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.207814 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="355f59b0-e1a4-47c9-b24b-fffda1be9952" containerName="glance-log" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.207829 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="355f59b0-e1a4-47c9-b24b-fffda1be9952" containerName="glance-httpd" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.207839 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08f91ab-f702-49a8-be10-3114d8a10e48" containerName="glance-log" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.208908 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.216175 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.216372 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lvfmv" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.216538 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.216857 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.244668 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.258236 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.266181 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.278876 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.280613 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.283083 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.285796 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.286308 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.383896 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.383946 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.383980 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.383999 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtd2n\" (UniqueName: 
\"kubernetes.io/projected/0c264d15-baa6-494c-89e1-220296a37ce8-kube-api-access-jtd2n\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384019 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4gfc\" (UniqueName: \"kubernetes.io/projected/ab258ad1-5952-4586-97ac-9da3ad97658d-kube-api-access-k4gfc\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384041 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384083 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384104 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384118 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384170 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384198 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384214 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-logs\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.384272 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.461182 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="355f59b0-e1a4-47c9-b24b-fffda1be9952" path="/var/lib/kubelet/pods/355f59b0-e1a4-47c9-b24b-fffda1be9952/volumes" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.464247 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43741e1d-af3f-4800-a5a7-cca0f6620cbc" path="/var/lib/kubelet/pods/43741e1d-af3f-4800-a5a7-cca0f6620cbc/volumes" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.467486 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd4153b0-ab10-423f-bb3a-be4bac79c118" path="/var/lib/kubelet/pods/dd4153b0-ab10-423f-bb3a-be4bac79c118/volumes" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.472116 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e08f91ab-f702-49a8-be10-3114d8a10e48" path="/var/lib/kubelet/pods/e08f91ab-f702-49a8-be10-3114d8a10e48/volumes" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.485664 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.485736 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.485766 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtd2n\" (UniqueName: \"kubernetes.io/projected/0c264d15-baa6-494c-89e1-220296a37ce8-kube-api-access-jtd2n\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.485790 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4gfc\" (UniqueName: \"kubernetes.io/projected/ab258ad1-5952-4586-97ac-9da3ad97658d-kube-api-access-k4gfc\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.485824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.485843 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.485864 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.487218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.487288 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.487354 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 
crc kubenswrapper[4790]: I1003 08:58:40.487463 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.487557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.487713 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.487941 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-logs\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.488049 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.488193 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.488802 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.509162 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.520423 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.521254 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.522388 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-logs\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.523648 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-logs\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.523884 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.524254 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.532557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.534824 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.535526 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.607593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.611248 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.611269 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.611641 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4gfc\" (UniqueName: \"kubernetes.io/projected/ab258ad1-5952-4586-97ac-9da3ad97658d-kube-api-access-k4gfc\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.611850 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtd2n\" (UniqueName: \"kubernetes.io/projected/0c264d15-baa6-494c-89e1-220296a37ce8-kube-api-access-jtd2n\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: E1003 08:58:40.616021 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode08f91ab_f702_49a8_be10_3114d8a10e48.slice/crio-e47eef746fe26100f7053a206f35675a1034a7cc37a5c5ac37e36abb7ccc53f6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod355f59b0_e1a4_47c9_b24b_fffda1be9952.slice/crio-fa87d5770606723279f6b17ced44e0a3ae528328c0f2db42338fdc106015cd38\": RecentStats: unable to find data in memory cache]" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.631761 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.643659 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.740838 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.752805 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.824398 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d76b4b9d6-n5tqf" event={"ID":"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5","Type":"ContainerStarted","Data":"bc33e5db07b735662f20eb2b839ba0dbfaa1141b576d9dfed09994c044e63804"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.824913 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d76b4b9d6-n5tqf" event={"ID":"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5","Type":"ContainerStarted","Data":"80ff8f257cad22f768c65b4fbdbfe5c80ba083d64adccf248419774a119fc165"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.826683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6448764df-qdvpl" event={"ID":"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef","Type":"ContainerStarted","Data":"4d9ecde0aa8e78ebd504b850bd733c9f1f1640a4a1ec7d3f30d5c89b961cce0f"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.826811 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6448764df-qdvpl" podUID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" containerName="horizon-log" containerID="cri-o://fc91064cded5b32a5632b39284688d49158263bf22c6a86f5afd9e7d3936de2f" gracePeriod=30 Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.827366 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6448764df-qdvpl" podUID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" containerName="horizon" containerID="cri-o://4d9ecde0aa8e78ebd504b850bd733c9f1f1640a4a1ec7d3f30d5c89b961cce0f" gracePeriod=30 Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.841796 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/watcher-applier-0" event={"ID":"67485d96-5ca0-4195-bb7a-d09d03d3dc49","Type":"ContainerStarted","Data":"065a93cf76deee6299a5a5a7d3e1dbafa3562a7c693ca01f268c1f7666811b5c"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.847253 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0817c5-b72b-4433-977d-6185317b0885","Type":"ContainerStarted","Data":"fab70dfb9f385a94ccc6a11aad0924e34371da2c99c29c954d36abeba9e33c9e"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.866509 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d887b5cbf-5qjjt" event={"ID":"8a368664-2338-4579-ae15-981ee5ca3194","Type":"ContainerStarted","Data":"80d9e9f284c78efe1c4af25f8abb5f3875804eaeaa94d80bb165c97f01a16664"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.866666 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d887b5cbf-5qjjt" podUID="8a368664-2338-4579-ae15-981ee5ca3194" containerName="horizon-log" containerID="cri-o://c9c519b5c6dd5924b44d4d2bc8af4c14d93fd4cea6763325b628d674eb8b39b4" gracePeriod=30 Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.866754 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d887b5cbf-5qjjt" podUID="8a368664-2338-4579-ae15-981ee5ca3194" containerName="horizon" containerID="cri-o://80d9e9f284c78efe1c4af25f8abb5f3875804eaeaa94d80bb165c97f01a16664" gracePeriod=30 Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.871880 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6448764df-qdvpl" podStartSLOduration=4.509498852 podStartE2EDuration="18.871863837s" podCreationTimestamp="2025-10-03 08:58:22 +0000 UTC" firstStartedPulling="2025-10-03 08:58:24.140263642 +0000 UTC m=+1090.493197880" lastFinishedPulling="2025-10-03 08:58:38.502628627 +0000 UTC m=+1104.855562865" 
observedRunningTime="2025-10-03 08:58:40.85191797 +0000 UTC m=+1107.204852228" watchObservedRunningTime="2025-10-03 08:58:40.871863837 +0000 UTC m=+1107.224798085" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.880896 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f486799b-gwc4l" event={"ID":"6b9b4763-1706-4c7e-b39f-0d8311a9637d","Type":"ContainerStarted","Data":"5be36c0be71be565b58276cb0c461e9e22dce6052a782291f0b3d281cc59f964"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.880945 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f486799b-gwc4l" event={"ID":"6b9b4763-1706-4c7e-b39f-0d8311a9637d","Type":"ContainerStarted","Data":"8c59753834ebab4aa2634b7add0c054fc379d2f9c4ea877414c2c1de89ac8e38"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.887018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-brfpm" event={"ID":"20340665-5e73-4bfd-8a1a-6d21e0d16432","Type":"ContainerStarted","Data":"393ec01869d710e54cb79af23763bbc6c201fa62633d76b7abd231c77d64ad76"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.888748 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"35a981dd-1fb1-4a29-a36c-592ea3925ff7","Type":"ContainerStarted","Data":"0756731148ec4e5f1681cd64aa3f20411e6b35df4b805f488fb6e06f6474e65a"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.888775 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"35a981dd-1fb1-4a29-a36c-592ea3925ff7","Type":"ContainerStarted","Data":"3186d5368edc29f905c35efc70437e28a583147d846e4a3da578be64dcc7d48a"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.896908 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bbwh5"] Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.897262 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-867694955c-fpnzb" event={"ID":"64754289-91b5-4f84-9809-6271b6a13031","Type":"ContainerStarted","Data":"140fd24e39bc9f1ab43f72b08f0a295a95db3c2d185cb86fac96c194e10cadb5"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.897457 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-867694955c-fpnzb" podUID="64754289-91b5-4f84-9809-6271b6a13031" containerName="horizon-log" containerID="cri-o://19ff0d78e06cc94459f21d1f5f64963a5b790c52733b5c522801481fb8aaedea" gracePeriod=30 Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.898259 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-867694955c-fpnzb" podUID="64754289-91b5-4f84-9809-6271b6a13031" containerName="horizon" containerID="cri-o://140fd24e39bc9f1ab43f72b08f0a295a95db3c2d185cb86fac96c194e10cadb5" gracePeriod=30 Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.903267 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-867694955c-fpnzb" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.898612 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d887b5cbf-5qjjt" podStartSLOduration=5.245189915 podStartE2EDuration="21.898587619s" podCreationTimestamp="2025-10-03 08:58:19 +0000 UTC" firstStartedPulling="2025-10-03 08:58:21.755800291 +0000 UTC m=+1088.108734529" lastFinishedPulling="2025-10-03 08:58:38.409197995 +0000 UTC m=+1104.762132233" observedRunningTime="2025-10-03 08:58:40.892134213 +0000 UTC m=+1107.245068471" watchObservedRunningTime="2025-10-03 08:58:40.898587619 +0000 UTC m=+1107.251521857" Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.908540 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mfdjk" 
event={"ID":"df20e14b-d7d8-4cab-8b7c-61122da75fbf","Type":"ContainerStarted","Data":"0d7452c96884917989d401aa60ff69a7463322b4b922a19b9ee7a87d035506bb"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.908598 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mfdjk" event={"ID":"df20e14b-d7d8-4cab-8b7c-61122da75fbf","Type":"ContainerStarted","Data":"166a3036570750727b0dd758229bd28aafecd40eb06b2140fd160ae44b5f7978"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.935155 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f0afbb0f-f4dc-4814-8c7f-791351972c0b","Type":"ContainerStarted","Data":"a908578b6502ff3984c4ef14391723bc62abedc7411bba3d07eac6cc51be543c"} Oct 03 08:58:40 crc kubenswrapper[4790]: I1003 08:58:40.967458 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-867694955c-fpnzb" podStartSLOduration=4.740864754 podStartE2EDuration="20.967439957s" podCreationTimestamp="2025-10-03 08:58:20 +0000 UTC" firstStartedPulling="2025-10-03 08:58:22.276033643 +0000 UTC m=+1088.628967881" lastFinishedPulling="2025-10-03 08:58:38.502608846 +0000 UTC m=+1104.855543084" observedRunningTime="2025-10-03 08:58:40.923764439 +0000 UTC m=+1107.276698677" watchObservedRunningTime="2025-10-03 08:58:40.967439957 +0000 UTC m=+1107.320374195" Oct 03 08:58:41 crc kubenswrapper[4790]: I1003 08:58:41.000773 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-mfdjk" podStartSLOduration=10.000754079 podStartE2EDuration="10.000754079s" podCreationTimestamp="2025-10-03 08:58:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:40.946007109 +0000 UTC m=+1107.298941347" watchObservedRunningTime="2025-10-03 08:58:41.000754079 +0000 UTC m=+1107.353688317" Oct 03 08:58:41 crc 
kubenswrapper[4790]: I1003 08:58:41.625929 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:58:41 crc kubenswrapper[4790]: I1003 08:58:41.788001 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.016416 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c264d15-baa6-494c-89e1-220296a37ce8","Type":"ContainerStarted","Data":"75e14685cbdc766dac9c5631ebf06f24403d08c618e0cee83389ef736be2b117"} Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.027256 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbwh5" event={"ID":"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad","Type":"ContainerStarted","Data":"db89e74913a6abb0a0855c991ea609b1ce7803291149958d8aa0f7fb86212ea7"} Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.027309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbwh5" event={"ID":"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad","Type":"ContainerStarted","Data":"2c06d5bb919d322773f80f35ef9296f43725d6ab6d1350c7dd9b9a90e35cb5f8"} Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.036004 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d76b4b9d6-n5tqf" event={"ID":"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5","Type":"ContainerStarted","Data":"4a4cc8546043f16febf7615f741031a66032e2c6d6c4ccd1550b14aee53138a2"} Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.049280 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab258ad1-5952-4586-97ac-9da3ad97658d","Type":"ContainerStarted","Data":"8fc3021598e228d28ab9357026f9f22e644b7b0d478b260227e6425b23115872"} Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.051052 4790 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/keystone-bootstrap-bbwh5" podStartSLOduration=3.051034303 podStartE2EDuration="3.051034303s" podCreationTimestamp="2025-10-03 08:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:42.046837618 +0000 UTC m=+1108.399771866" watchObservedRunningTime="2025-10-03 08:58:42.051034303 +0000 UTC m=+1108.403968541" Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.052989 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f486799b-gwc4l" event={"ID":"6b9b4763-1706-4c7e-b39f-0d8311a9637d","Type":"ContainerStarted","Data":"a43c79207758e34c6eeebc098b910355829d4cf1c07e91fbd5f8d8ec7846ac40"} Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.072258 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d76b4b9d6-n5tqf" podStartSLOduration=13.072231044 podStartE2EDuration="13.072231044s" podCreationTimestamp="2025-10-03 08:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:42.066642291 +0000 UTC m=+1108.419576529" watchObservedRunningTime="2025-10-03 08:58:42.072231044 +0000 UTC m=+1108.425165282" Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.090469 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"35a981dd-1fb1-4a29-a36c-592ea3925ff7","Type":"ContainerStarted","Data":"5e70dc57effa61a0c354857a8899fc8ee8fea8bcf0b9ee5be0b152dd874e9e0f"} Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.090519 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.094450 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-86f486799b-gwc4l" podStartSLOduration=13.094431482 
podStartE2EDuration="13.094431482s" podCreationTimestamp="2025-10-03 08:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:42.088977093 +0000 UTC m=+1108.441911351" watchObservedRunningTime="2025-10-03 08:58:42.094431482 +0000 UTC m=+1108.447365720" Oct 03 08:58:42 crc kubenswrapper[4790]: I1003 08:58:42.110562 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=12.110539634 podStartE2EDuration="12.110539634s" podCreationTimestamp="2025-10-03 08:58:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:58:42.107699636 +0000 UTC m=+1108.460633884" watchObservedRunningTime="2025-10-03 08:58:42.110539634 +0000 UTC m=+1108.463473872" Oct 03 08:58:43 crc kubenswrapper[4790]: I1003 08:58:43.106248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c264d15-baa6-494c-89e1-220296a37ce8","Type":"ContainerStarted","Data":"e1f7fd3c3f6cd89b03346ac4f66120fb71f51641e1aa7429240630f77d743d72"} Oct 03 08:58:43 crc kubenswrapper[4790]: I1003 08:58:43.375988 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6448764df-qdvpl" Oct 03 08:58:44 crc kubenswrapper[4790]: I1003 08:58:44.721424 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 03 08:58:45 crc kubenswrapper[4790]: I1003 08:58:45.168717 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab258ad1-5952-4586-97ac-9da3ad97658d","Type":"ContainerStarted","Data":"a00e77f9ee436c7cc8964d611a19e338bf93e927ab08a3cc85c2f19a770a75f4"} Oct 03 08:58:46 crc kubenswrapper[4790]: I1003 08:58:46.641824 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/watcher-api-0" Oct 03 08:58:47 crc kubenswrapper[4790]: I1003 08:58:47.210676 4790 generic.go:334] "Generic (PLEG): container finished" podID="a839597d-be75-4b9f-9d18-a2e8045a674b" containerID="436f9b4ae537c4ee8eadbceb94dcc53da18a9ada7d1217d7ec775e3e975c1a8d" exitCode=0 Oct 03 08:58:47 crc kubenswrapper[4790]: I1003 08:58:47.211195 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vzdz9" event={"ID":"a839597d-be75-4b9f-9d18-a2e8045a674b","Type":"ContainerDied","Data":"436f9b4ae537c4ee8eadbceb94dcc53da18a9ada7d1217d7ec775e3e975c1a8d"} Oct 03 08:58:47 crc kubenswrapper[4790]: I1003 08:58:47.216771 4790 generic.go:334] "Generic (PLEG): container finished" podID="7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" containerID="db89e74913a6abb0a0855c991ea609b1ce7803291149958d8aa0f7fb86212ea7" exitCode=0 Oct 03 08:58:47 crc kubenswrapper[4790]: I1003 08:58:47.216805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbwh5" event={"ID":"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad","Type":"ContainerDied","Data":"db89e74913a6abb0a0855c991ea609b1ce7803291149958d8aa0f7fb86212ea7"} Oct 03 08:58:49 crc kubenswrapper[4790]: I1003 08:58:49.476241 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d76b4b9d6-n5tqf" Oct 03 08:58:49 crc kubenswrapper[4790]: I1003 08:58:49.476638 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7d76b4b9d6-n5tqf" Oct 03 08:58:49 crc kubenswrapper[4790]: I1003 08:58:49.600962 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:49 crc kubenswrapper[4790]: I1003 08:58:49.601025 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:58:50 crc kubenswrapper[4790]: I1003 08:58:50.379329 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/horizon-6d887b5cbf-5qjjt" Oct 03 08:58:51 crc kubenswrapper[4790]: I1003 08:58:51.642102 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 03 08:58:51 crc kubenswrapper[4790]: I1003 08:58:51.651469 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.301127 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vzdz9" event={"ID":"a839597d-be75-4b9f-9d18-a2e8045a674b","Type":"ContainerDied","Data":"43cf0741e59860177d867a729e291d29d497b1d449c8b71eb006fc2f9b0344a8"} Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.301506 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43cf0741e59860177d867a729e291d29d497b1d449c8b71eb006fc2f9b0344a8" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.303134 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bbwh5" event={"ID":"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad","Type":"ContainerDied","Data":"2c06d5bb919d322773f80f35ef9296f43725d6ab6d1350c7dd9b9a90e35cb5f8"} Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.303181 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c06d5bb919d322773f80f35ef9296f43725d6ab6d1350c7dd9b9a90e35cb5f8" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.313055 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.406114 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.431170 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508246 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-scripts\") pod \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508295 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-credential-keys\") pod \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508406 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-combined-ca-bundle\") pod \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508462 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-fernet-keys\") pod \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508495 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-combined-ca-bundle\") pod \"a839597d-be75-4b9f-9d18-a2e8045a674b\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508525 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-config-data\") pod \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508595 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a839597d-be75-4b9f-9d18-a2e8045a674b-logs\") pod \"a839597d-be75-4b9f-9d18-a2e8045a674b\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508634 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-scripts\") pod \"a839597d-be75-4b9f-9d18-a2e8045a674b\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508726 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-config-data\") pod \"a839597d-be75-4b9f-9d18-a2e8045a674b\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508775 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx8pr\" (UniqueName: \"kubernetes.io/projected/a839597d-be75-4b9f-9d18-a2e8045a674b-kube-api-access-rx8pr\") pod \"a839597d-be75-4b9f-9d18-a2e8045a674b\" (UID: \"a839597d-be75-4b9f-9d18-a2e8045a674b\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.508837 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8xlz\" (UniqueName: \"kubernetes.io/projected/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-kube-api-access-z8xlz\") pod \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\" (UID: \"7197a2e5-b159-4ea2-bcb6-a58faf21d4ad\") " Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.511030 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a839597d-be75-4b9f-9d18-a2e8045a674b-logs" (OuterVolumeSpecName: "logs") pod "a839597d-be75-4b9f-9d18-a2e8045a674b" (UID: "a839597d-be75-4b9f-9d18-a2e8045a674b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.525348 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-scripts" (OuterVolumeSpecName: "scripts") pod "a839597d-be75-4b9f-9d18-a2e8045a674b" (UID: "a839597d-be75-4b9f-9d18-a2e8045a674b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.526787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" (UID: "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.527350 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a839597d-be75-4b9f-9d18-a2e8045a674b-kube-api-access-rx8pr" (OuterVolumeSpecName: "kube-api-access-rx8pr") pod "a839597d-be75-4b9f-9d18-a2e8045a674b" (UID: "a839597d-be75-4b9f-9d18-a2e8045a674b"). InnerVolumeSpecName "kube-api-access-rx8pr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.539976 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-kube-api-access-z8xlz" (OuterVolumeSpecName: "kube-api-access-z8xlz") pod "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" (UID: "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad"). InnerVolumeSpecName "kube-api-access-z8xlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.541994 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-scripts" (OuterVolumeSpecName: "scripts") pod "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" (UID: "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.543871 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" (UID: "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.561865 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a839597d-be75-4b9f-9d18-a2e8045a674b" (UID: "a839597d-be75-4b9f-9d18-a2e8045a674b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.573975 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-config-data" (OuterVolumeSpecName: "config-data") pod "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" (UID: "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.576114 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-config-data" (OuterVolumeSpecName: "config-data") pod "a839597d-be75-4b9f-9d18-a2e8045a674b" (UID: "a839597d-be75-4b9f-9d18-a2e8045a674b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.593730 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" (UID: "7197a2e5-b159-4ea2-bcb6-a58faf21d4ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610700 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610751 4790 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610766 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610779 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610792 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610803 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610814 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a839597d-be75-4b9f-9d18-a2e8045a674b-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610824 4790 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610836 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a839597d-be75-4b9f-9d18-a2e8045a674b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610848 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx8pr\" (UniqueName: \"kubernetes.io/projected/a839597d-be75-4b9f-9d18-a2e8045a674b-kube-api-access-rx8pr\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:52 crc kubenswrapper[4790]: I1003 08:58:52.610862 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8xlz\" (UniqueName: \"kubernetes.io/projected/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad-kube-api-access-z8xlz\") on node \"crc\" DevicePath \"\"" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.314535 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bbwh5" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.314686 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-vzdz9" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.596278 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7c4ff96776-8dljp"] Oct 03 08:58:53 crc kubenswrapper[4790]: E1003 08:58:53.596806 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" containerName="keystone-bootstrap" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.596825 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" containerName="keystone-bootstrap" Oct 03 08:58:53 crc kubenswrapper[4790]: E1003 08:58:53.596845 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a839597d-be75-4b9f-9d18-a2e8045a674b" containerName="placement-db-sync" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.596851 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a839597d-be75-4b9f-9d18-a2e8045a674b" containerName="placement-db-sync" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.597049 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a839597d-be75-4b9f-9d18-a2e8045a674b" containerName="placement-db-sync" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.597080 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" containerName="keystone-bootstrap" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.598700 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.602410 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.602506 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.602754 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.602940 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fnq8f" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.603098 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.603292 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.604908 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d5469d486-dtt4h"] Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.606562 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.609316 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.612897 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.613078 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-czqcc" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.613360 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.613962 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.619141 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7c4ff96776-8dljp"] Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.629964 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d5469d486-dtt4h"] Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.739929 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-combined-ca-bundle\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.739990 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-public-tls-certs\") pod \"placement-6d5469d486-dtt4h\" (UID: 
\"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740025 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-credential-keys\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740068 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qmws\" (UniqueName: \"kubernetes.io/projected/750b0c2b-c871-46f7-b318-78ace47356a9-kube-api-access-2qmws\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-internal-tls-certs\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-config-data\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740166 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38463967-5519-4ea0-8081-b22ac7858270-logs\") pod \"placement-6d5469d486-dtt4h\" 
(UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740196 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-scripts\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740220 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-public-tls-certs\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jkkt\" (UniqueName: \"kubernetes.io/projected/38463967-5519-4ea0-8081-b22ac7858270-kube-api-access-7jkkt\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-scripts\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740638 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-combined-ca-bundle\") pod 
\"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740681 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-config-data\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740709 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-internal-tls-certs\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.740765 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-fernet-keys\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.842819 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-internal-tls-certs\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.842879 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-config-data\") pod \"placement-6d5469d486-dtt4h\" (UID: 
\"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.842918 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38463967-5519-4ea0-8081-b22ac7858270-logs\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.842948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-scripts\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.842978 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-public-tls-certs\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.843000 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jkkt\" (UniqueName: \"kubernetes.io/projected/38463967-5519-4ea0-8081-b22ac7858270-kube-api-access-7jkkt\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.843027 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-scripts\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc 
kubenswrapper[4790]: I1003 08:58:53.843067 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-combined-ca-bundle\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.843093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-config-data\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.843122 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-internal-tls-certs\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.843154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-fernet-keys\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.843250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-combined-ca-bundle\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.843278 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-public-tls-certs\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.843308 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-credential-keys\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.843351 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qmws\" (UniqueName: \"kubernetes.io/projected/750b0c2b-c871-46f7-b318-78ace47356a9-kube-api-access-2qmws\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.860156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38463967-5519-4ea0-8081-b22ac7858270-logs\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.863731 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-config-data\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.869001 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-scripts\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.871186 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-combined-ca-bundle\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.872802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-scripts\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.873059 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-config-data\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.873774 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-internal-tls-certs\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.878270 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-fernet-keys\") pod \"keystone-7c4ff96776-8dljp\" (UID: 
\"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.878966 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-combined-ca-bundle\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.879465 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-public-tls-certs\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.879683 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qmws\" (UniqueName: \"kubernetes.io/projected/750b0c2b-c871-46f7-b318-78ace47356a9-kube-api-access-2qmws\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.879882 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-credential-keys\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.881132 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/750b0c2b-c871-46f7-b318-78ace47356a9-internal-tls-certs\") pod \"keystone-7c4ff96776-8dljp\" (UID: \"750b0c2b-c871-46f7-b318-78ace47356a9\") " pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 
08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.886014 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jkkt\" (UniqueName: \"kubernetes.io/projected/38463967-5519-4ea0-8081-b22ac7858270-kube-api-access-7jkkt\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.890379 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38463967-5519-4ea0-8081-b22ac7858270-public-tls-certs\") pod \"placement-6d5469d486-dtt4h\" (UID: \"38463967-5519-4ea0-8081-b22ac7858270\") " pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.939876 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:58:53 crc kubenswrapper[4790]: I1003 08:58:53.962089 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:58:55 crc kubenswrapper[4790]: I1003 08:58:55.185235 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 03 08:58:55 crc kubenswrapper[4790]: I1003 08:58:55.185510 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api-log" containerID="cri-o://0756731148ec4e5f1681cd64aa3f20411e6b35df4b805f488fb6e06f6474e65a" gracePeriod=30 Oct 03 08:58:55 crc kubenswrapper[4790]: I1003 08:58:55.185600 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api" containerID="cri-o://5e70dc57effa61a0c354857a8899fc8ee8fea8bcf0b9ee5be0b152dd874e9e0f" gracePeriod=30 Oct 03 08:58:55 crc kubenswrapper[4790]: I1003 08:58:55.341544 4790 generic.go:334] "Generic (PLEG): container finished" podID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerID="0756731148ec4e5f1681cd64aa3f20411e6b35df4b805f488fb6e06f6474e65a" exitCode=143 Oct 03 08:58:55 crc kubenswrapper[4790]: I1003 08:58:55.341778 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"35a981dd-1fb1-4a29-a36c-592ea3925ff7","Type":"ContainerDied","Data":"0756731148ec4e5f1681cd64aa3f20411e6b35df4b805f488fb6e06f6474e65a"} Oct 03 08:58:56 crc kubenswrapper[4790]: I1003 08:58:56.640766 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: connect: connection refused" Oct 03 08:58:56 crc kubenswrapper[4790]: I1003 08:58:56.641191 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" 
podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: connect: connection refused" Oct 03 08:58:57 crc kubenswrapper[4790]: I1003 08:58:57.368545 4790 generic.go:334] "Generic (PLEG): container finished" podID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerID="5e70dc57effa61a0c354857a8899fc8ee8fea8bcf0b9ee5be0b152dd874e9e0f" exitCode=0 Oct 03 08:58:57 crc kubenswrapper[4790]: I1003 08:58:57.368611 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"35a981dd-1fb1-4a29-a36c-592ea3925ff7","Type":"ContainerDied","Data":"5e70dc57effa61a0c354857a8899fc8ee8fea8bcf0b9ee5be0b152dd874e9e0f"} Oct 03 08:59:01 crc kubenswrapper[4790]: I1003 08:59:01.642486 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: connect: connection refused" Oct 03 08:59:01 crc kubenswrapper[4790]: I1003 08:59:01.642484 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: connect: connection refused" Oct 03 08:59:01 crc kubenswrapper[4790]: I1003 08:59:01.842747 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:59:01 crc kubenswrapper[4790]: I1003 08:59:01.874096 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7d76b4b9d6-n5tqf" Oct 03 08:59:03 crc kubenswrapper[4790]: E1003 08:59:03.228480 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.58:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Oct 03 08:59:03 crc kubenswrapper[4790]: E1003 08:59:03.228680 4790 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Oct 03 08:59:03 crc kubenswrapper[4790]: E1003 08:59:03.231097 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.58:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zk7df,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Ter
minationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-brfpm_openstack(20340665-5e73-4bfd-8a1a-6d21e0d16432): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:59:03 crc kubenswrapper[4790]: E1003 08:59:03.232189 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-brfpm" podUID="20340665-5e73-4bfd-8a1a-6d21e0d16432" Oct 03 08:59:03 crc kubenswrapper[4790]: E1003 08:59:03.478768 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.58:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-brfpm" podUID="20340665-5e73-4bfd-8a1a-6d21e0d16432" Oct 03 08:59:03 crc kubenswrapper[4790]: I1003 08:59:03.610894 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7d76b4b9d6-n5tqf" Oct 03 08:59:03 crc kubenswrapper[4790]: I1003 08:59:03.698826 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-86f486799b-gwc4l" Oct 03 08:59:03 crc kubenswrapper[4790]: I1003 08:59:03.756740 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d76b4b9d6-n5tqf"] Oct 03 08:59:04 crc kubenswrapper[4790]: I1003 08:59:04.486700 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d76b4b9d6-n5tqf" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon-log" containerID="cri-o://bc33e5db07b735662f20eb2b839ba0dbfaa1141b576d9dfed09994c044e63804" gracePeriod=30 Oct 03 08:59:04 crc 
kubenswrapper[4790]: I1003 08:59:04.486814 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d76b4b9d6-n5tqf" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon" containerID="cri-o://4a4cc8546043f16febf7615f741031a66032e2c6d6c4ccd1550b14aee53138a2" gracePeriod=30 Oct 03 08:59:04 crc kubenswrapper[4790]: E1003 08:59:04.916467 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 03 08:59:04 crc kubenswrapper[4790]: E1003 08:59:04.916834 4790 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.58:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 03 08:59:04 crc kubenswrapper[4790]: E1003 08:59:04.917066 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.58:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hv7rx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-gklmz_openstack(96250103-59af-4ec6-a036-f6e8222b02e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 03 08:59:04 crc kubenswrapper[4790]: E1003 08:59:04.918882 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-gklmz" podUID="96250103-59af-4ec6-a036-f6e8222b02e0" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.242349 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.280511 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-combined-ca-bundle\") pod \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.280620 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35a981dd-1fb1-4a29-a36c-592ea3925ff7-logs\") pod \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.280677 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-custom-prometheus-ca\") pod \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " Oct 03 08:59:05 crc 
kubenswrapper[4790]: I1003 08:59:05.280707 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-config-data\") pod \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.281674 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a981dd-1fb1-4a29-a36c-592ea3925ff7-logs" (OuterVolumeSpecName: "logs") pod "35a981dd-1fb1-4a29-a36c-592ea3925ff7" (UID: "35a981dd-1fb1-4a29-a36c-592ea3925ff7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.338350 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35a981dd-1fb1-4a29-a36c-592ea3925ff7" (UID: "35a981dd-1fb1-4a29-a36c-592ea3925ff7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.382373 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwdjj\" (UniqueName: \"kubernetes.io/projected/35a981dd-1fb1-4a29-a36c-592ea3925ff7-kube-api-access-xwdjj\") pod \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\" (UID: \"35a981dd-1fb1-4a29-a36c-592ea3925ff7\") " Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.382826 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.382845 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35a981dd-1fb1-4a29-a36c-592ea3925ff7-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.386858 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a981dd-1fb1-4a29-a36c-592ea3925ff7-kube-api-access-xwdjj" (OuterVolumeSpecName: "kube-api-access-xwdjj") pod "35a981dd-1fb1-4a29-a36c-592ea3925ff7" (UID: "35a981dd-1fb1-4a29-a36c-592ea3925ff7"). InnerVolumeSpecName "kube-api-access-xwdjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.412293 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "35a981dd-1fb1-4a29-a36c-592ea3925ff7" (UID: "35a981dd-1fb1-4a29-a36c-592ea3925ff7"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.447861 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-config-data" (OuterVolumeSpecName: "config-data") pod "35a981dd-1fb1-4a29-a36c-592ea3925ff7" (UID: "35a981dd-1fb1-4a29-a36c-592ea3925ff7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.487193 4790 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.487225 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a981dd-1fb1-4a29-a36c-592ea3925ff7-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.487237 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwdjj\" (UniqueName: \"kubernetes.io/projected/35a981dd-1fb1-4a29-a36c-592ea3925ff7-kube-api-access-xwdjj\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.505567 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0817c5-b72b-4433-977d-6185317b0885","Type":"ContainerStarted","Data":"a7695d2d6a51762860632c605a70ca0fddde0aa8e14b150fee62a009907aca95"} Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.508989 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"67485d96-5ca0-4195-bb7a-d09d03d3dc49","Type":"ContainerStarted","Data":"1cd3ca3f8cab2c9e46c0c3aabd5d50b766648f23d471d9f872e26525f9cd4590"} Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.520171 4790 generic.go:334] "Generic 
(PLEG): container finished" podID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerID="4a4cc8546043f16febf7615f741031a66032e2c6d6c4ccd1550b14aee53138a2" exitCode=0 Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.520276 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d76b4b9d6-n5tqf" event={"ID":"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5","Type":"ContainerDied","Data":"4a4cc8546043f16febf7615f741031a66032e2c6d6c4ccd1550b14aee53138a2"} Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.529648 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.532171 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"35a981dd-1fb1-4a29-a36c-592ea3925ff7","Type":"ContainerDied","Data":"3186d5368edc29f905c35efc70437e28a583147d846e4a3da578be64dcc7d48a"} Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.532217 4790 scope.go:117] "RemoveContainer" containerID="5e70dc57effa61a0c354857a8899fc8ee8fea8bcf0b9ee5be0b152dd874e9e0f" Oct 03 08:59:05 crc kubenswrapper[4790]: E1003 08:59:05.532416 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.58:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-gklmz" podUID="96250103-59af-4ec6-a036-f6e8222b02e0" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.536371 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=9.415294146 podStartE2EDuration="34.536350206s" podCreationTimestamp="2025-10-03 08:58:31 +0000 UTC" firstStartedPulling="2025-10-03 08:58:39.733476149 +0000 UTC m=+1106.086410387" lastFinishedPulling="2025-10-03 08:59:04.854532209 +0000 UTC m=+1131.207466447" 
observedRunningTime="2025-10-03 08:59:05.52738132 +0000 UTC m=+1131.880315558" watchObservedRunningTime="2025-10-03 08:59:05.536350206 +0000 UTC m=+1131.889284444" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.567873 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d5469d486-dtt4h"] Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.629043 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.678015 4790 scope.go:117] "RemoveContainer" containerID="0756731148ec4e5f1681cd64aa3f20411e6b35df4b805f488fb6e06f6474e65a" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.678177 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.723770 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 03 08:59:05 crc kubenswrapper[4790]: E1003 08:59:05.724335 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.724355 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api" Oct 03 08:59:05 crc kubenswrapper[4790]: E1003 08:59:05.724399 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api-log" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.724407 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api-log" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.724721 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.724744 
4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" containerName="watcher-api-log" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.726065 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.734968 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.735247 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.735347 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.746488 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.751020 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7c4ff96776-8dljp"] Oct 03 08:59:05 crc kubenswrapper[4790]: W1003 08:59:05.756819 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod750b0c2b_c871_46f7_b318_78ace47356a9.slice/crio-c2a91eed9320508c648abe83c3bc8c47859915a49491ecf31be1fb4645ff308b WatchSource:0}: Error finding container c2a91eed9320508c648abe83c3bc8c47859915a49491ecf31be1fb4645ff308b: Status 404 returned error can't find the container with id c2a91eed9320508c648abe83c3bc8c47859915a49491ecf31be1fb4645ff308b Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.916102 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-combined-ca-bundle\") pod \"watcher-api-0\" (UID: 
\"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.916715 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfbjk\" (UniqueName: \"kubernetes.io/projected/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-kube-api-access-dfbjk\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.916756 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-public-tls-certs\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.916802 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-logs\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.916829 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:05 crc kubenswrapper[4790]: I1003 08:59:05.916899 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-config-data\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:05 crc kubenswrapper[4790]: 
I1003 08:59:05.917106 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.018811 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.019068 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbjk\" (UniqueName: \"kubernetes.io/projected/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-kube-api-access-dfbjk\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.019999 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-public-tls-certs\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.020494 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-logs\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.020606 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.020750 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-config-data\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.020891 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.020937 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-logs\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.026352 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-public-tls-certs\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.026690 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-config-data\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.026964 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.027105 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.028277 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.043118 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfbjk\" (UniqueName: \"kubernetes.io/projected/e0b0e105-a5dc-4a34-b966-14fba0fd89ec-kube-api-access-dfbjk\") pod \"watcher-api-0\" (UID: \"e0b0e105-a5dc-4a34-b966-14fba0fd89ec\") " pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.075848 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.387872 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a981dd-1fb1-4a29-a36c-592ea3925ff7" path="/var/lib/kubelet/pods/35a981dd-1fb1-4a29-a36c-592ea3925ff7/volumes" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.538827 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f0afbb0f-f4dc-4814-8c7f-791351972c0b","Type":"ContainerStarted","Data":"b4032890331a2dab234fecf385619d911173462f72145113195e212d5b0cefe6"} Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.547489 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab258ad1-5952-4586-97ac-9da3ad97658d","Type":"ContainerStarted","Data":"46229c309a222bb8e95f5453935ba4bf91e74c328298d721f09c2610d2e33236"} Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.555110 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c264d15-baa6-494c-89e1-220296a37ce8","Type":"ContainerStarted","Data":"86300d1c7512e83f6372e380c2bce36bc05ced935845c7c701a17989990bda32"} Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.556596 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=10.445787666 podStartE2EDuration="35.556580876s" podCreationTimestamp="2025-10-03 08:58:31 +0000 UTC" firstStartedPulling="2025-10-03 08:58:39.743789501 +0000 UTC m=+1106.096723749" lastFinishedPulling="2025-10-03 08:59:04.854582721 +0000 UTC m=+1131.207516959" observedRunningTime="2025-10-03 08:59:06.556095722 +0000 UTC m=+1132.909029980" watchObservedRunningTime="2025-10-03 08:59:06.556580876 +0000 UTC m=+1132.909515114" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.559162 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-7c4ff96776-8dljp" event={"ID":"750b0c2b-c871-46f7-b318-78ace47356a9","Type":"ContainerStarted","Data":"ac5dd70aa1eff04cc741a203bcd7933a1d346a572225ed1399c1c2dc1b47fe3b"} Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.559217 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c4ff96776-8dljp" event={"ID":"750b0c2b-c871-46f7-b318-78ace47356a9","Type":"ContainerStarted","Data":"c2a91eed9320508c648abe83c3bc8c47859915a49491ecf31be1fb4645ff308b"} Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.559270 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.562788 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d5469d486-dtt4h" event={"ID":"38463967-5519-4ea0-8081-b22ac7858270","Type":"ContainerStarted","Data":"964b7e492342f29571f1baee95cff8414a838430fbfb79025c21f4567fea47a8"} Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.562819 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d5469d486-dtt4h" event={"ID":"38463967-5519-4ea0-8081-b22ac7858270","Type":"ContainerStarted","Data":"efe5c55e033a88405a83b47d76cbde982757fc25e0e41b10375f06d99ab95d9a"} Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.562836 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d5469d486-dtt4h" event={"ID":"38463967-5519-4ea0-8081-b22ac7858270","Type":"ContainerStarted","Data":"3b39f94973796abdb74a11b3315763ae56ffaf3d9dabcdc9897f714080cc3c31"} Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.562874 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.562960 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:59:06 crc 
kubenswrapper[4790]: I1003 08:59:06.590176 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=26.590158616 podStartE2EDuration="26.590158616s" podCreationTimestamp="2025-10-03 08:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:06.586816254 +0000 UTC m=+1132.939750512" watchObservedRunningTime="2025-10-03 08:59:06.590158616 +0000 UTC m=+1132.943092854" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.615899 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=26.615884821 podStartE2EDuration="26.615884821s" podCreationTimestamp="2025-10-03 08:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:06.614348799 +0000 UTC m=+1132.967283037" watchObservedRunningTime="2025-10-03 08:59:06.615884821 +0000 UTC m=+1132.968819079" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.650203 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7c4ff96776-8dljp" podStartSLOduration=13.6501843 podStartE2EDuration="13.6501843s" podCreationTimestamp="2025-10-03 08:58:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:06.641249976 +0000 UTC m=+1132.994184214" watchObservedRunningTime="2025-10-03 08:59:06.6501843 +0000 UTC m=+1133.003118528" Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.683109 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d5469d486-dtt4h" podStartSLOduration=13.683091242 podStartE2EDuration="13.683091242s" podCreationTimestamp="2025-10-03 08:58:53 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:06.667055333 +0000 UTC m=+1133.019989581" watchObservedRunningTime="2025-10-03 08:59:06.683091242 +0000 UTC m=+1133.036025480"
Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.733269 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Oct 03 08:59:06 crc kubenswrapper[4790]: I1003 08:59:06.919111 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Oct 03 08:59:07 crc kubenswrapper[4790]: I1003 08:59:07.586376 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e0b0e105-a5dc-4a34-b966-14fba0fd89ec","Type":"ContainerStarted","Data":"be769c9caa993eee583cd51a2032431ea23062a4a0f58c3be978b4022bcd5964"}
Oct 03 08:59:07 crc kubenswrapper[4790]: I1003 08:59:07.587372 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e0b0e105-a5dc-4a34-b966-14fba0fd89ec","Type":"ContainerStarted","Data":"b8b8efbffe8f05f76e5d0af1920eb3fc299e4adb520c2f4e0188411362b81688"}
Oct 03 08:59:07 crc kubenswrapper[4790]: I1003 08:59:07.587453 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"e0b0e105-a5dc-4a34-b966-14fba0fd89ec","Type":"ContainerStarted","Data":"b97179bff6ec37efc50d74eeb3fe2becb8ebff14825a4fd82c9c51c05a81a935"}
Oct 03 08:59:07 crc kubenswrapper[4790]: I1003 08:59:07.587518 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Oct 03 08:59:07 crc kubenswrapper[4790]: I1003 08:59:07.589773 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="e0b0e105-a5dc-4a34-b966-14fba0fd89ec" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.171:9322/\": dial tcp 10.217.0.171:9322: connect: connection refused"
Oct 03 08:59:07 crc kubenswrapper[4790]: I1003 08:59:07.622199 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.622174218 podStartE2EDuration="2.622174218s" podCreationTimestamp="2025-10-03 08:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:07.606685605 +0000 UTC m=+1133.959619853" watchObservedRunningTime="2025-10-03 08:59:07.622174218 +0000 UTC m=+1133.975108456"
Oct 03 08:59:09 crc kubenswrapper[4790]: I1003 08:59:09.491479 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d76b4b9d6-n5tqf" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.158:8443: connect: connection refused"
Oct 03 08:59:09 crc kubenswrapper[4790]: I1003 08:59:09.605640 4790 generic.go:334] "Generic (PLEG): container finished" podID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerID="b4032890331a2dab234fecf385619d911173462f72145113195e212d5b0cefe6" exitCode=1
Oct 03 08:59:09 crc kubenswrapper[4790]: I1003 08:59:09.605799 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f0afbb0f-f4dc-4814-8c7f-791351972c0b","Type":"ContainerDied","Data":"b4032890331a2dab234fecf385619d911173462f72145113195e212d5b0cefe6"}
Oct 03 08:59:09 crc kubenswrapper[4790]: I1003 08:59:09.606739 4790 scope.go:117] "RemoveContainer" containerID="b4032890331a2dab234fecf385619d911173462f72145113195e212d5b0cefe6"
Oct 03 08:59:09 crc kubenswrapper[4790]: I1003 08:59:09.608749 4790 generic.go:334] "Generic (PLEG): container finished" podID="df20e14b-d7d8-4cab-8b7c-61122da75fbf" containerID="0d7452c96884917989d401aa60ff69a7463322b4b922a19b9ee7a87d035506bb" exitCode=0
Oct 03 08:59:09 crc kubenswrapper[4790]: I1003 08:59:09.608838 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mfdjk" event={"ID":"df20e14b-d7d8-4cab-8b7c-61122da75fbf","Type":"ContainerDied","Data":"0d7452c96884917989d401aa60ff69a7463322b4b922a19b9ee7a87d035506bb"}
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.186606 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.186672 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.186726 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.187506 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab4e236d11b4b58cf11bf821ee4fe0466ca737afca708932f0cbf0c7c147ea38"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.187593 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://ab4e236d11b4b58cf11bf821ee4fe0466ca737afca708932f0cbf0c7c147ea38" gracePeriod=600
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.623374 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="ab4e236d11b4b58cf11bf821ee4fe0466ca737afca708932f0cbf0c7c147ea38" exitCode=0
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.623523 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"ab4e236d11b4b58cf11bf821ee4fe0466ca737afca708932f0cbf0c7c147ea38"}
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.623555 4790 scope.go:117] "RemoveContainer" containerID="e2e53bba0341ebe043843fcef071e2d9f340680e77501673a1e1a360df997492"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.742381 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.742708 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.742725 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.742738 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.786846 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.786897 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.786909 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.786921 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.830727 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.853793 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.875088 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 03 08:59:10 crc kubenswrapper[4790]: I1003 08:59:10.875231 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.075965 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.076070 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.135160 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.639396 4790 generic.go:334] "Generic (PLEG): container finished" podID="64754289-91b5-4f84-9809-6271b6a13031" containerID="140fd24e39bc9f1ab43f72b08f0a295a95db3c2d185cb86fac96c194e10cadb5" exitCode=137
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.639759 4790 generic.go:334] "Generic (PLEG): container finished" podID="64754289-91b5-4f84-9809-6271b6a13031" containerID="19ff0d78e06cc94459f21d1f5f64963a5b790c52733b5c522801481fb8aaedea" exitCode=137
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.639795 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867694955c-fpnzb" event={"ID":"64754289-91b5-4f84-9809-6271b6a13031","Type":"ContainerDied","Data":"140fd24e39bc9f1ab43f72b08f0a295a95db3c2d185cb86fac96c194e10cadb5"}
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.639819 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867694955c-fpnzb" event={"ID":"64754289-91b5-4f84-9809-6271b6a13031","Type":"ContainerDied","Data":"19ff0d78e06cc94459f21d1f5f64963a5b790c52733b5c522801481fb8aaedea"}
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.641958 4790 generic.go:334] "Generic (PLEG): container finished" podID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" containerID="4d9ecde0aa8e78ebd504b850bd733c9f1f1640a4a1ec7d3f30d5c89b961cce0f" exitCode=137
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.641974 4790 generic.go:334] "Generic (PLEG): container finished" podID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" containerID="fc91064cded5b32a5632b39284688d49158263bf22c6a86f5afd9e7d3936de2f" exitCode=137
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.642022 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6448764df-qdvpl" event={"ID":"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef","Type":"ContainerDied","Data":"4d9ecde0aa8e78ebd504b850bd733c9f1f1640a4a1ec7d3f30d5c89b961cce0f"}
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.642040 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6448764df-qdvpl" event={"ID":"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef","Type":"ContainerDied","Data":"fc91064cded5b32a5632b39284688d49158263bf22c6a86f5afd9e7d3936de2f"}
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.643622 4790 generic.go:334] "Generic (PLEG): container finished" podID="8a368664-2338-4579-ae15-981ee5ca3194" containerID="80d9e9f284c78efe1c4af25f8abb5f3875804eaeaa94d80bb165c97f01a16664" exitCode=137
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.643634 4790 generic.go:334] "Generic (PLEG): container finished" podID="8a368664-2338-4579-ae15-981ee5ca3194" containerID="c9c519b5c6dd5924b44d4d2bc8af4c14d93fd4cea6763325b628d674eb8b39b4" exitCode=137
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.644442 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d887b5cbf-5qjjt" event={"ID":"8a368664-2338-4579-ae15-981ee5ca3194","Type":"ContainerDied","Data":"80d9e9f284c78efe1c4af25f8abb5f3875804eaeaa94d80bb165c97f01a16664"}
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.644461 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d887b5cbf-5qjjt" event={"ID":"8a368664-2338-4579-ae15-981ee5ca3194","Type":"ContainerDied","Data":"c9c519b5c6dd5924b44d4d2bc8af4c14d93fd4cea6763325b628d674eb8b39b4"}
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.919559 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.925236 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.925273 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Oct 03 08:59:11 crc kubenswrapper[4790]: I1003 08:59:11.977336 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Oct 03 08:59:12 crc kubenswrapper[4790]: I1003 08:59:12.705509 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Oct 03 08:59:13 crc kubenswrapper[4790]: I1003 08:59:13.893222 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mfdjk"
Oct 03 08:59:13 crc kubenswrapper[4790]: I1003 08:59:13.906702 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-867694955c-fpnzb"
Oct 03 08:59:13 crc kubenswrapper[4790]: I1003 08:59:13.960828 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts7hg\" (UniqueName: \"kubernetes.io/projected/df20e14b-d7d8-4cab-8b7c-61122da75fbf-kube-api-access-ts7hg\") pod \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") "
Oct 03 08:59:13 crc kubenswrapper[4790]: I1003 08:59:13.961021 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-config\") pod \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") "
Oct 03 08:59:13 crc kubenswrapper[4790]: I1003 08:59:13.961139 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-combined-ca-bundle\") pod \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\" (UID: \"df20e14b-d7d8-4cab-8b7c-61122da75fbf\") "
Oct 03 08:59:13 crc kubenswrapper[4790]: I1003 08:59:13.979322 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df20e14b-d7d8-4cab-8b7c-61122da75fbf-kube-api-access-ts7hg" (OuterVolumeSpecName: "kube-api-access-ts7hg") pod "df20e14b-d7d8-4cab-8b7c-61122da75fbf" (UID: "df20e14b-d7d8-4cab-8b7c-61122da75fbf"). InnerVolumeSpecName "kube-api-access-ts7hg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:59:13 crc kubenswrapper[4790]: I1003 08:59:13.992351 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df20e14b-d7d8-4cab-8b7c-61122da75fbf" (UID: "df20e14b-d7d8-4cab-8b7c-61122da75fbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.021881 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-config" (OuterVolumeSpecName: "config") pod "df20e14b-d7d8-4cab-8b7c-61122da75fbf" (UID: "df20e14b-d7d8-4cab-8b7c-61122da75fbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.062884 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlfrj\" (UniqueName: \"kubernetes.io/projected/64754289-91b5-4f84-9809-6271b6a13031-kube-api-access-jlfrj\") pod \"64754289-91b5-4f84-9809-6271b6a13031\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.062950 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-scripts\") pod \"64754289-91b5-4f84-9809-6271b6a13031\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.063007 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64754289-91b5-4f84-9809-6271b6a13031-logs\") pod \"64754289-91b5-4f84-9809-6271b6a13031\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.063093 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64754289-91b5-4f84-9809-6271b6a13031-horizon-secret-key\") pod \"64754289-91b5-4f84-9809-6271b6a13031\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.063119 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-config-data\") pod \"64754289-91b5-4f84-9809-6271b6a13031\" (UID: \"64754289-91b5-4f84-9809-6271b6a13031\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.063714 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-config\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.063732 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df20e14b-d7d8-4cab-8b7c-61122da75fbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.063759 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts7hg\" (UniqueName: \"kubernetes.io/projected/df20e14b-d7d8-4cab-8b7c-61122da75fbf-kube-api-access-ts7hg\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.063958 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64754289-91b5-4f84-9809-6271b6a13031-logs" (OuterVolumeSpecName: "logs") pod "64754289-91b5-4f84-9809-6271b6a13031" (UID: "64754289-91b5-4f84-9809-6271b6a13031"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.067063 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64754289-91b5-4f84-9809-6271b6a13031-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "64754289-91b5-4f84-9809-6271b6a13031" (UID: "64754289-91b5-4f84-9809-6271b6a13031"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.067723 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64754289-91b5-4f84-9809-6271b6a13031-kube-api-access-jlfrj" (OuterVolumeSpecName: "kube-api-access-jlfrj") pod "64754289-91b5-4f84-9809-6271b6a13031" (UID: "64754289-91b5-4f84-9809-6271b6a13031"). InnerVolumeSpecName "kube-api-access-jlfrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.097221 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-scripts" (OuterVolumeSpecName: "scripts") pod "64754289-91b5-4f84-9809-6271b6a13031" (UID: "64754289-91b5-4f84-9809-6271b6a13031"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.105495 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-config-data" (OuterVolumeSpecName: "config-data") pod "64754289-91b5-4f84-9809-6271b6a13031" (UID: "64754289-91b5-4f84-9809-6271b6a13031"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.165337 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlfrj\" (UniqueName: \"kubernetes.io/projected/64754289-91b5-4f84-9809-6271b6a13031-kube-api-access-jlfrj\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.165370 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.165381 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64754289-91b5-4f84-9809-6271b6a13031-logs\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.165393 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/64754289-91b5-4f84-9809-6271b6a13031-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.165404 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64754289-91b5-4f84-9809-6271b6a13031-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.541615 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d887b5cbf-5qjjt"
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.552191 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.678412 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-horizon-secret-key\") pod \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.678833 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-logs\") pod \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.678970 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4blv\" (UniqueName: \"kubernetes.io/projected/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-kube-api-access-q4blv\") pod \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.679248 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-logs" (OuterVolumeSpecName: "logs") pod "e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" (UID: "e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.679469 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-config-data\") pod \"8a368664-2338-4579-ae15-981ee5ca3194\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.679560 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a368664-2338-4579-ae15-981ee5ca3194-logs\") pod \"8a368664-2338-4579-ae15-981ee5ca3194\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.679607 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-scripts\") pod \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.679646 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xvwx\" (UniqueName: \"kubernetes.io/projected/8a368664-2338-4579-ae15-981ee5ca3194-kube-api-access-5xvwx\") pod \"8a368664-2338-4579-ae15-981ee5ca3194\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.680147 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a368664-2338-4579-ae15-981ee5ca3194-horizon-secret-key\") pod \"8a368664-2338-4579-ae15-981ee5ca3194\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.680176 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-scripts\") pod \"8a368664-2338-4579-ae15-981ee5ca3194\" (UID: \"8a368664-2338-4579-ae15-981ee5ca3194\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.680241 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-config-data\") pod \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\" (UID: \"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef\") "
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.680028 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a368664-2338-4579-ae15-981ee5ca3194-logs" (OuterVolumeSpecName: "logs") pod "8a368664-2338-4579-ae15-981ee5ca3194" (UID: "8a368664-2338-4579-ae15-981ee5ca3194"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.681341 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-logs\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.681359 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a368664-2338-4579-ae15-981ee5ca3194-logs\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.683202 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-kube-api-access-q4blv" (OuterVolumeSpecName: "kube-api-access-q4blv") pod "e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" (UID: "e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef"). InnerVolumeSpecName "kube-api-access-q4blv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.683894 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a368664-2338-4579-ae15-981ee5ca3194-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8a368664-2338-4579-ae15-981ee5ca3194" (UID: "8a368664-2338-4579-ae15-981ee5ca3194"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.683899 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" (UID: "e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.697497 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a368664-2338-4579-ae15-981ee5ca3194-kube-api-access-5xvwx" (OuterVolumeSpecName: "kube-api-access-5xvwx") pod "8a368664-2338-4579-ae15-981ee5ca3194" (UID: "8a368664-2338-4579-ae15-981ee5ca3194"). InnerVolumeSpecName "kube-api-access-5xvwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.697849 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867694955c-fpnzb" event={"ID":"64754289-91b5-4f84-9809-6271b6a13031","Type":"ContainerDied","Data":"b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf"}
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.697904 4790 scope.go:117] "RemoveContainer" containerID="140fd24e39bc9f1ab43f72b08f0a295a95db3c2d185cb86fac96c194e10cadb5"
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.698084 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-867694955c-fpnzb"
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.707344 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-config-data" (OuterVolumeSpecName: "config-data") pod "8a368664-2338-4579-ae15-981ee5ca3194" (UID: "8a368664-2338-4579-ae15-981ee5ca3194"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.715386 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mfdjk" event={"ID":"df20e14b-d7d8-4cab-8b7c-61122da75fbf","Type":"ContainerDied","Data":"166a3036570750727b0dd758229bd28aafecd40eb06b2140fd160ae44b5f7978"}
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.715454 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="166a3036570750727b0dd758229bd28aafecd40eb06b2140fd160ae44b5f7978"
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.715597 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mfdjk"
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.719529 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-scripts" (OuterVolumeSpecName: "scripts") pod "8a368664-2338-4579-ae15-981ee5ca3194" (UID: "8a368664-2338-4579-ae15-981ee5ca3194"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.734390 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-scripts" (OuterVolumeSpecName: "scripts") pod "e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" (UID: "e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.754528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6448764df-qdvpl" event={"ID":"e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef","Type":"ContainerDied","Data":"01256a351eb3e28957318809b2108790993abc2462c0abc393d83c0d76eced39"}
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.754754 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6448764df-qdvpl"
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.777421 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d887b5cbf-5qjjt" event={"ID":"8a368664-2338-4579-ae15-981ee5ca3194","Type":"ContainerDied","Data":"bcba4addbf1ac988637f56e72ee0f4bba335de286226d15fb3303d9af941a7b0"}
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.778511 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-config-data" (OuterVolumeSpecName: "config-data") pod "e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" (UID: "e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.778666 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d887b5cbf-5qjjt"
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.784091 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.784141 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.784156 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xvwx\" (UniqueName: \"kubernetes.io/projected/8a368664-2338-4579-ae15-981ee5ca3194-kube-api-access-5xvwx\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.784170 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8a368664-2338-4579-ae15-981ee5ca3194-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.784180 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a368664-2338-4579-ae15-981ee5ca3194-scripts\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.784188 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.784201 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.784211 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4blv\" (UniqueName: \"kubernetes.io/projected/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef-kube-api-access-q4blv\") on node \"crc\" DevicePath \"\""
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.889485 4790 scope.go:117] "RemoveContainer" containerID="19ff0d78e06cc94459f21d1f5f64963a5b790c52733b5c522801481fb8aaedea"
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.934058 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-867694955c-fpnzb"]
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.955546 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-867694955c-fpnzb"]
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.959419 4790 scope.go:117] "RemoveContainer" containerID="4d9ecde0aa8e78ebd504b850bd733c9f1f1640a4a1ec7d3f30d5c89b961cce0f"
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.972705 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d887b5cbf-5qjjt"]
Oct 03 08:59:14 crc kubenswrapper[4790]: I1003 08:59:14.987055 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d887b5cbf-5qjjt"]
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.124764 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b4b8db59f-55rqj"]
Oct 03 08:59:15 crc kubenswrapper[4790]: E1003 08:59:15.125257 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" containerName="horizon-log"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125274 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" containerName="horizon-log"
Oct 03 08:59:15 crc kubenswrapper[4790]: E1003 08:59:15.125311 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df20e14b-d7d8-4cab-8b7c-61122da75fbf" containerName="neutron-db-sync"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125320 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="df20e14b-d7d8-4cab-8b7c-61122da75fbf" containerName="neutron-db-sync"
Oct 03 08:59:15 crc kubenswrapper[4790]: E1003 08:59:15.125338 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64754289-91b5-4f84-9809-6271b6a13031" containerName="horizon-log"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125346 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="64754289-91b5-4f84-9809-6271b6a13031" containerName="horizon-log"
Oct 03 08:59:15 crc kubenswrapper[4790]: E1003 08:59:15.125360 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64754289-91b5-4f84-9809-6271b6a13031" containerName="horizon"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125367 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="64754289-91b5-4f84-9809-6271b6a13031" containerName="horizon"
Oct 03 08:59:15 crc kubenswrapper[4790]: E1003 08:59:15.125382 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" containerName="horizon"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125390 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" containerName="horizon"
Oct 03 08:59:15 crc kubenswrapper[4790]: E1003 08:59:15.125409 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a368664-2338-4579-ae15-981ee5ca3194" containerName="horizon"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125417 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a368664-2338-4579-ae15-981ee5ca3194" containerName="horizon"
Oct 03 08:59:15 crc kubenswrapper[4790]: E1003 08:59:15.125437 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a368664-2338-4579-ae15-981ee5ca3194" containerName="horizon-log"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125445 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a368664-2338-4579-ae15-981ee5ca3194" containerName="horizon-log"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125687 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="64754289-91b5-4f84-9809-6271b6a13031" containerName="horizon-log"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125713 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" containerName="horizon"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125725 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a368664-2338-4579-ae15-981ee5ca3194" containerName="horizon"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125737 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a368664-2338-4579-ae15-981ee5ca3194" containerName="horizon-log"
Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125748 4790 memory_manager.go:354] "RemoveStaleState removing state"
podUID="64754289-91b5-4f84-9809-6271b6a13031" containerName="horizon" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125762 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="df20e14b-d7d8-4cab-8b7c-61122da75fbf" containerName="neutron-db-sync" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.125778 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" containerName="horizon-log" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.127101 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.178712 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6448764df-qdvpl"] Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.193564 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llrbj\" (UniqueName: \"kubernetes.io/projected/4a6a5c83-e69b-4f25-913b-0ac02229efeb-kube-api-access-llrbj\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.193777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-sb\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.193922 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-config\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " 
pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.194043 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-svc\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.194061 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-swift-storage-0\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.194119 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-nb\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.220018 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4b8db59f-55rqj"] Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.252613 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6448764df-qdvpl"] Oct 03 08:59:15 crc kubenswrapper[4790]: E1003 08:59:15.289505 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="8a0817c5-b72b-4433-977d-6185317b0885" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.296385 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llrbj\" (UniqueName: \"kubernetes.io/projected/4a6a5c83-e69b-4f25-913b-0ac02229efeb-kube-api-access-llrbj\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.296464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-sb\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.296523 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-config\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.296607 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-svc\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.296634 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-swift-storage-0\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.296667 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-nb\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.298538 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-config\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.298664 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-sb\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.299592 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-svc\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.299768 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-swift-storage-0\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.300197 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-nb\") pod 
\"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.305755 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8454698868-xzzl2"] Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.307962 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.314019 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wkzbz" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.314229 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.314364 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.315557 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.325374 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llrbj\" (UniqueName: \"kubernetes.io/projected/4a6a5c83-e69b-4f25-913b-0ac02229efeb-kube-api-access-llrbj\") pod \"dnsmasq-dns-b4b8db59f-55rqj\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.332455 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8454698868-xzzl2"] Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.397494 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-combined-ca-bundle\") pod \"neutron-8454698868-xzzl2\" (UID: 
\"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.397603 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-ovndb-tls-certs\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.397670 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-httpd-config\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.397697 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qbtw\" (UniqueName: \"kubernetes.io/projected/71091fa8-ba4a-42b7-b738-a6b012c5b65c-kube-api-access-9qbtw\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.397722 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-config\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.400752 4790 scope.go:117] "RemoveContainer" containerID="fc91064cded5b32a5632b39284688d49158263bf22c6a86f5afd9e7d3936de2f" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.422510 4790 scope.go:117] "RemoveContainer" 
containerID="80d9e9f284c78efe1c4af25f8abb5f3875804eaeaa94d80bb165c97f01a16664" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.473354 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.499765 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-ovndb-tls-certs\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.499847 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-httpd-config\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.499879 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qbtw\" (UniqueName: \"kubernetes.io/projected/71091fa8-ba4a-42b7-b738-a6b012c5b65c-kube-api-access-9qbtw\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.499905 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-config\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.499969 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-combined-ca-bundle\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.505301 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-ovndb-tls-certs\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.514366 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-httpd-config\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.515080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-combined-ca-bundle\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.517472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-config\") pod \"neutron-8454698868-xzzl2\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.521632 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qbtw\" (UniqueName: \"kubernetes.io/projected/71091fa8-ba4a-42b7-b738-a6b012c5b65c-kube-api-access-9qbtw\") pod \"neutron-8454698868-xzzl2\" (UID: 
\"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.627489 4790 scope.go:117] "RemoveContainer" containerID="c9c519b5c6dd5924b44d4d2bc8af4c14d93fd4cea6763325b628d674eb8b39b4" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.692202 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.893218 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"c2fcffece27550dbe144383b7c492d28b1ee8a149f741f79ecb706a01a9252b8"} Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.957885 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0817c5-b72b-4433-977d-6185317b0885","Type":"ContainerStarted","Data":"1adcf2f3d72270a7149b524157da80c9909c9b0afd371e6fb65353403b50e68b"} Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.958241 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.957992 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="ceilometer-notification-agent" containerID="cri-o://fab70dfb9f385a94ccc6a11aad0924e34371da2c99c29c954d36abeba9e33c9e" gracePeriod=30 Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 08:59:15.958362 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="sg-core" containerID="cri-o://a7695d2d6a51762860632c605a70ca0fddde0aa8e14b150fee62a009907aca95" gracePeriod=30 Oct 03 08:59:15 crc kubenswrapper[4790]: I1003 
08:59:15.958406 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="proxy-httpd" containerID="cri-o://1adcf2f3d72270a7149b524157da80c9909c9b0afd371e6fb65353403b50e68b" gracePeriod=30 Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.030329 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f0afbb0f-f4dc-4814-8c7f-791351972c0b","Type":"ContainerStarted","Data":"e09e195cd4e8efea622e902164e68ef566c1f2278445f5df4794272fd43c9eb9"} Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.081213 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.136096 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.218685 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.219138 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.245841 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.245937 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.309172 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.318130 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4b8db59f-55rqj"] Oct 03 08:59:16 
crc kubenswrapper[4790]: I1003 08:59:16.421323 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64754289-91b5-4f84-9809-6271b6a13031" path="/var/lib/kubelet/pods/64754289-91b5-4f84-9809-6271b6a13031/volumes" Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.422281 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a368664-2338-4579-ae15-981ee5ca3194" path="/var/lib/kubelet/pods/8a368664-2338-4579-ae15-981ee5ca3194/volumes" Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.423058 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef" path="/var/lib/kubelet/pods/e5a8c4b6-bbe9-4dda-b8b9-4b84fe0864ef/volumes" Oct 03 08:59:16 crc kubenswrapper[4790]: I1003 08:59:16.707124 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8454698868-xzzl2"] Oct 03 08:59:16 crc kubenswrapper[4790]: W1003 08:59:16.712953 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71091fa8_ba4a_42b7_b738_a6b012c5b65c.slice/crio-8ddbf07efb39bd64f43d898b232d9e11186dd4ea0b665ea2919fd71b603b4471 WatchSource:0}: Error finding container 8ddbf07efb39bd64f43d898b232d9e11186dd4ea0b665ea2919fd71b603b4471: Status 404 returned error can't find the container with id 8ddbf07efb39bd64f43d898b232d9e11186dd4ea0b665ea2919fd71b603b4471 Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.061150 4790 generic.go:334] "Generic (PLEG): container finished" podID="8a0817c5-b72b-4433-977d-6185317b0885" containerID="1adcf2f3d72270a7149b524157da80c9909c9b0afd371e6fb65353403b50e68b" exitCode=0 Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.061408 4790 generic.go:334] "Generic (PLEG): container finished" podID="8a0817c5-b72b-4433-977d-6185317b0885" containerID="a7695d2d6a51762860632c605a70ca0fddde0aa8e14b150fee62a009907aca95" exitCode=2 Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 
08:59:17.061190 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0817c5-b72b-4433-977d-6185317b0885","Type":"ContainerDied","Data":"1adcf2f3d72270a7149b524157da80c9909c9b0afd371e6fb65353403b50e68b"} Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.061470 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0817c5-b72b-4433-977d-6185317b0885","Type":"ContainerDied","Data":"a7695d2d6a51762860632c605a70ca0fddde0aa8e14b150fee62a009907aca95"} Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.063434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8454698868-xzzl2" event={"ID":"71091fa8-ba4a-42b7-b738-a6b012c5b65c","Type":"ContainerStarted","Data":"8ddbf07efb39bd64f43d898b232d9e11186dd4ea0b665ea2919fd71b603b4471"} Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.064820 4790 generic.go:334] "Generic (PLEG): container finished" podID="4a6a5c83-e69b-4f25-913b-0ac02229efeb" containerID="2302088d6ec301d64f813bec35e7ce522e314ef668561f6a9c0e45fc24466efd" exitCode=0 Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.065311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" event={"ID":"4a6a5c83-e69b-4f25-913b-0ac02229efeb","Type":"ContainerDied","Data":"2302088d6ec301d64f813bec35e7ce522e314ef668561f6a9c0e45fc24466efd"} Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.065346 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" event={"ID":"4a6a5c83-e69b-4f25-913b-0ac02229efeb","Type":"ContainerStarted","Data":"59679c0090ee92e8b908a13e20bbaa08dbaf010095c47bf62b3049d83d403a52"} Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.082937 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.919591 4790 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/neutron-95d849fd9-jww5w"] Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.922288 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.933196 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-95d849fd9-jww5w"] Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.935945 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 03 08:59:17 crc kubenswrapper[4790]: I1003 08:59:17.936123 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.109569 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-httpd-config\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.109948 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-config\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.110011 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-combined-ca-bundle\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.110035 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-internal-tls-certs\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.110067 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-public-tls-certs\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.110084 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-ovndb-tls-certs\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.110116 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btqx9\" (UniqueName: \"kubernetes.io/projected/11739e3d-ef66-4152-bd55-6c3de0dd19e2-kube-api-access-btqx9\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.117306 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8454698868-xzzl2" event={"ID":"71091fa8-ba4a-42b7-b738-a6b012c5b65c","Type":"ContainerStarted","Data":"40fed40024a54795fdebb448fab0ca6cabc5e85aef38ed269bd60974b3067a14"} Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.117351 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-8454698868-xzzl2" event={"ID":"71091fa8-ba4a-42b7-b738-a6b012c5b65c","Type":"ContainerStarted","Data":"aa79f711bd93e6248a4171b28d343a171e80f79979523fb04a3f7639633acf14"} Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.117907 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.175862 4790 generic.go:334] "Generic (PLEG): container finished" podID="8a0817c5-b72b-4433-977d-6185317b0885" containerID="fab70dfb9f385a94ccc6a11aad0924e34371da2c99c29c954d36abeba9e33c9e" exitCode=0 Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.175955 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0817c5-b72b-4433-977d-6185317b0885","Type":"ContainerDied","Data":"fab70dfb9f385a94ccc6a11aad0924e34371da2c99c29c954d36abeba9e33c9e"} Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.210261 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" event={"ID":"4a6a5c83-e69b-4f25-913b-0ac02229efeb","Type":"ContainerStarted","Data":"863747eddd27a64a15fc8cab55a9036722aff232d9afb3736fa812d4fa908709"} Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.210353 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.212551 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btqx9\" (UniqueName: \"kubernetes.io/projected/11739e3d-ef66-4152-bd55-6c3de0dd19e2-kube-api-access-btqx9\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.212794 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-httpd-config\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.213068 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-config\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.213229 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-combined-ca-bundle\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.213274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-internal-tls-certs\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.213338 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-public-tls-certs\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.213373 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-ovndb-tls-certs\") pod \"neutron-95d849fd9-jww5w\" (UID: 
\"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.227563 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-combined-ca-bundle\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.246054 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-ovndb-tls-certs\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.246857 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-httpd-config\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.247326 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-config\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.248400 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-internal-tls-certs\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.262551 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/11739e3d-ef66-4152-bd55-6c3de0dd19e2-public-tls-certs\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.266169 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btqx9\" (UniqueName: \"kubernetes.io/projected/11739e3d-ef66-4152-bd55-6c3de0dd19e2-kube-api-access-btqx9\") pod \"neutron-95d849fd9-jww5w\" (UID: \"11739e3d-ef66-4152-bd55-6c3de0dd19e2\") " pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.274026 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" podStartSLOduration=3.274001739 podStartE2EDuration="3.274001739s" podCreationTimestamp="2025-10-03 08:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:18.261998589 +0000 UTC m=+1144.614932827" watchObservedRunningTime="2025-10-03 08:59:18.274001739 +0000 UTC m=+1144.626935977" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.276596 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8454698868-xzzl2" podStartSLOduration=3.27658403 podStartE2EDuration="3.27658403s" podCreationTimestamp="2025-10-03 08:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:18.185777611 +0000 UTC m=+1144.538711859" watchObservedRunningTime="2025-10-03 08:59:18.27658403 +0000 UTC m=+1144.629518268" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.284656 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.393826 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.520889 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-combined-ca-bundle\") pod \"8a0817c5-b72b-4433-977d-6185317b0885\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.521331 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-sg-core-conf-yaml\") pod \"8a0817c5-b72b-4433-977d-6185317b0885\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.521397 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmsgj\" (UniqueName: \"kubernetes.io/projected/8a0817c5-b72b-4433-977d-6185317b0885-kube-api-access-xmsgj\") pod \"8a0817c5-b72b-4433-977d-6185317b0885\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.521441 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-scripts\") pod \"8a0817c5-b72b-4433-977d-6185317b0885\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.521459 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-config-data\") pod \"8a0817c5-b72b-4433-977d-6185317b0885\" (UID: 
\"8a0817c5-b72b-4433-977d-6185317b0885\") " Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.521591 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-run-httpd\") pod \"8a0817c5-b72b-4433-977d-6185317b0885\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.521634 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-log-httpd\") pod \"8a0817c5-b72b-4433-977d-6185317b0885\" (UID: \"8a0817c5-b72b-4433-977d-6185317b0885\") " Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.525021 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a0817c5-b72b-4433-977d-6185317b0885" (UID: "8a0817c5-b72b-4433-977d-6185317b0885"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.525293 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a0817c5-b72b-4433-977d-6185317b0885" (UID: "8a0817c5-b72b-4433-977d-6185317b0885"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.534153 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0817c5-b72b-4433-977d-6185317b0885-kube-api-access-xmsgj" (OuterVolumeSpecName: "kube-api-access-xmsgj") pod "8a0817c5-b72b-4433-977d-6185317b0885" (UID: "8a0817c5-b72b-4433-977d-6185317b0885"). InnerVolumeSpecName "kube-api-access-xmsgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.550690 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-scripts" (OuterVolumeSpecName: "scripts") pod "8a0817c5-b72b-4433-977d-6185317b0885" (UID: "8a0817c5-b72b-4433-977d-6185317b0885"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.623927 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmsgj\" (UniqueName: \"kubernetes.io/projected/8a0817c5-b72b-4433-977d-6185317b0885-kube-api-access-xmsgj\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.623959 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.623969 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.623979 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a0817c5-b72b-4433-977d-6185317b0885-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.734714 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a0817c5-b72b-4433-977d-6185317b0885" (UID: "8a0817c5-b72b-4433-977d-6185317b0885"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.776833 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a0817c5-b72b-4433-977d-6185317b0885" (UID: "8a0817c5-b72b-4433-977d-6185317b0885"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.829885 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.830201 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.871336 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-config-data" (OuterVolumeSpecName: "config-data") pod "8a0817c5-b72b-4433-977d-6185317b0885" (UID: "8a0817c5-b72b-4433-977d-6185317b0885"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:18 crc kubenswrapper[4790]: I1003 08:59:18.932864 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0817c5-b72b-4433-977d-6185317b0885-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.038305 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-95d849fd9-jww5w"] Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.240552 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-95d849fd9-jww5w" event={"ID":"11739e3d-ef66-4152-bd55-6c3de0dd19e2","Type":"ContainerStarted","Data":"9ed875f8e3a4a6a8373b4bcd101d36a08c0446830090ce6f4d9bdb788ab18169"} Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.246505 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a0817c5-b72b-4433-977d-6185317b0885","Type":"ContainerDied","Data":"6f68cbe5675af8b8176635455fcabbd6a69f76ebd35d37d256275064dd3d9117"} Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.246604 4790 scope.go:117] "RemoveContainer" containerID="1adcf2f3d72270a7149b524157da80c9909c9b0afd371e6fb65353403b50e68b" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.246759 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.270218 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gklmz" event={"ID":"96250103-59af-4ec6-a036-f6e8222b02e0","Type":"ContainerStarted","Data":"98bb661d0f9ba60d912f0c38255f37a9a0aca621bc6e59eb791b9e2f702ed8f5"} Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.287986 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-brfpm" event={"ID":"20340665-5e73-4bfd-8a1a-6d21e0d16432","Type":"ContainerStarted","Data":"8349d3d9c890e0f2559c2e27d15abf20cd2f6a976de2460c5606a8306b102d99"} Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.309514 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-gklmz" podStartSLOduration=11.590580144 podStartE2EDuration="49.309493837s" podCreationTimestamp="2025-10-03 08:58:30 +0000 UTC" firstStartedPulling="2025-10-03 08:58:39.70542179 +0000 UTC m=+1106.058356028" lastFinishedPulling="2025-10-03 08:59:17.424335483 +0000 UTC m=+1143.777269721" observedRunningTime="2025-10-03 08:59:19.292026358 +0000 UTC m=+1145.644960596" watchObservedRunningTime="2025-10-03 08:59:19.309493837 +0000 UTC m=+1145.662428075" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.324672 4790 scope.go:117] "RemoveContainer" containerID="a7695d2d6a51762860632c605a70ca0fddde0aa8e14b150fee62a009907aca95" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.373851 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-brfpm" podStartSLOduration=10.636716302 podStartE2EDuration="49.37383131s" podCreationTimestamp="2025-10-03 08:58:30 +0000 UTC" firstStartedPulling="2025-10-03 08:58:39.734436214 +0000 UTC m=+1106.087370452" lastFinishedPulling="2025-10-03 08:59:18.471551212 +0000 UTC m=+1144.824485460" observedRunningTime="2025-10-03 08:59:19.321349232 +0000 UTC 
m=+1145.674283470" watchObservedRunningTime="2025-10-03 08:59:19.37383131 +0000 UTC m=+1145.726765548" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.436668 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.447308 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.453352 4790 scope.go:117] "RemoveContainer" containerID="fab70dfb9f385a94ccc6a11aad0924e34371da2c99c29c954d36abeba9e33c9e" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.468397 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:19 crc kubenswrapper[4790]: E1003 08:59:19.468824 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="proxy-httpd" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.468837 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="proxy-httpd" Oct 03 08:59:19 crc kubenswrapper[4790]: E1003 08:59:19.468850 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="ceilometer-notification-agent" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.468857 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="ceilometer-notification-agent" Oct 03 08:59:19 crc kubenswrapper[4790]: E1003 08:59:19.468882 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="sg-core" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.468889 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="sg-core" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.469110 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="sg-core" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.469135 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="ceilometer-notification-agent" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.469153 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a0817c5-b72b-4433-977d-6185317b0885" containerName="proxy-httpd" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.471002 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.481466 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d76b4b9d6-n5tqf" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.158:8443: connect: connection refused" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.481950 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.482767 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.505813 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.551711 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-log-httpd\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 
08:59:19.551828 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-config-data\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.551853 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-run-httpd\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.551869 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.551917 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.551946 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqp6m\" (UniqueName: \"kubernetes.io/projected/c8f86f3a-d523-455f-97b2-6995541d8487-kube-api-access-bqp6m\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.551970 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-scripts\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.653243 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.655787 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqp6m\" (UniqueName: \"kubernetes.io/projected/c8f86f3a-d523-455f-97b2-6995541d8487-kube-api-access-bqp6m\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.655875 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-scripts\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.656353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-log-httpd\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.656561 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-config-data\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc 
kubenswrapper[4790]: I1003 08:59:19.656641 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-run-httpd\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.656662 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.658961 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-scripts\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.659235 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-run-httpd\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.659422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-log-httpd\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.659734 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.663215 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-config-data\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.664096 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.679269 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqp6m\" (UniqueName: \"kubernetes.io/projected/c8f86f3a-d523-455f-97b2-6995541d8487-kube-api-access-bqp6m\") pod \"ceilometer-0\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " pod="openstack/ceilometer-0" Oct 03 08:59:19 crc kubenswrapper[4790]: I1003 08:59:19.799274 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:59:20 crc kubenswrapper[4790]: I1003 08:59:20.265464 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:20 crc kubenswrapper[4790]: I1003 08:59:20.299792 4790 generic.go:334] "Generic (PLEG): container finished" podID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerID="e09e195cd4e8efea622e902164e68ef566c1f2278445f5df4794272fd43c9eb9" exitCode=1 Oct 03 08:59:20 crc kubenswrapper[4790]: I1003 08:59:20.299887 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f0afbb0f-f4dc-4814-8c7f-791351972c0b","Type":"ContainerDied","Data":"e09e195cd4e8efea622e902164e68ef566c1f2278445f5df4794272fd43c9eb9"} Oct 03 08:59:20 crc kubenswrapper[4790]: I1003 08:59:20.299934 4790 scope.go:117] "RemoveContainer" containerID="b4032890331a2dab234fecf385619d911173462f72145113195e212d5b0cefe6" Oct 03 08:59:20 crc kubenswrapper[4790]: I1003 08:59:20.300707 4790 scope.go:117] "RemoveContainer" containerID="e09e195cd4e8efea622e902164e68ef566c1f2278445f5df4794272fd43c9eb9" Oct 03 08:59:20 crc kubenswrapper[4790]: E1003 08:59:20.301062 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(f0afbb0f-f4dc-4814-8c7f-791351972c0b)\"" pod="openstack/watcher-decision-engine-0" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" Oct 03 08:59:20 crc kubenswrapper[4790]: I1003 08:59:20.306318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-95d849fd9-jww5w" event={"ID":"11739e3d-ef66-4152-bd55-6c3de0dd19e2","Type":"ContainerStarted","Data":"8dc47caa7a6699022747327553a5f62902b45a386bf7216bdd999142eeaeeec5"} Oct 03 08:59:20 crc kubenswrapper[4790]: I1003 08:59:20.308298 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c8f86f3a-d523-455f-97b2-6995541d8487","Type":"ContainerStarted","Data":"77c7c8470e544e061284624df7cc25f4049550cbd17eef5db9144e2c3b322e26"} Oct 03 08:59:20 crc kubenswrapper[4790]: I1003 08:59:20.342114 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0817c5-b72b-4433-977d-6185317b0885" path="/var/lib/kubelet/pods/8a0817c5-b72b-4433-977d-6185317b0885/volumes" Oct 03 08:59:21 crc kubenswrapper[4790]: I1003 08:59:21.327652 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-95d849fd9-jww5w" event={"ID":"11739e3d-ef66-4152-bd55-6c3de0dd19e2","Type":"ContainerStarted","Data":"6f427603ecdb756f9056f32de522c3044ab62a46633d0d8bbf76e4c8e3e64a35"} Oct 03 08:59:21 crc kubenswrapper[4790]: I1003 08:59:21.328261 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:21 crc kubenswrapper[4790]: I1003 08:59:21.334859 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8f86f3a-d523-455f-97b2-6995541d8487","Type":"ContainerStarted","Data":"728c655da19c3a9ced7deccd24d40a1cc7732051de2fa603ef66cdc99325c734"} Oct 03 08:59:21 crc kubenswrapper[4790]: I1003 08:59:21.334915 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8f86f3a-d523-455f-97b2-6995541d8487","Type":"ContainerStarted","Data":"dafeb127ec4a5866622b10537defdc778fdde863c4a580ec5e37ff408cb0f271"} Oct 03 08:59:21 crc kubenswrapper[4790]: I1003 08:59:21.359539 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-95d849fd9-jww5w" podStartSLOduration=4.359520889 podStartE2EDuration="4.359520889s" podCreationTimestamp="2025-10-03 08:59:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:21.354976786 +0000 UTC m=+1147.707911044" 
watchObservedRunningTime="2025-10-03 08:59:21.359520889 +0000 UTC m=+1147.712455127" Oct 03 08:59:21 crc kubenswrapper[4790]: E1003 08:59:21.747686 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice/crio-b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice/crio-166a3036570750727b0dd758229bd28aafecd40eb06b2140fd160ae44b5f7978\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice\": RecentStats: unable to find data in memory cache]" Oct 03 08:59:21 crc kubenswrapper[4790]: I1003 08:59:21.927813 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 08:59:21 crc kubenswrapper[4790]: I1003 08:59:21.927891 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 08:59:21 crc kubenswrapper[4790]: I1003 08:59:21.928802 4790 scope.go:117] "RemoveContainer" containerID="e09e195cd4e8efea622e902164e68ef566c1f2278445f5df4794272fd43c9eb9" Oct 03 08:59:21 crc kubenswrapper[4790]: E1003 08:59:21.929316 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(f0afbb0f-f4dc-4814-8c7f-791351972c0b)\"" pod="openstack/watcher-decision-engine-0" 
podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" Oct 03 08:59:22 crc kubenswrapper[4790]: I1003 08:59:22.347370 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8f86f3a-d523-455f-97b2-6995541d8487","Type":"ContainerStarted","Data":"1f6aa929dba32f2388e4c4211571dc2d0ace0f9a415f6df77e04005289adc718"} Oct 03 08:59:23 crc kubenswrapper[4790]: I1003 08:59:23.397873 4790 generic.go:334] "Generic (PLEG): container finished" podID="20340665-5e73-4bfd-8a1a-6d21e0d16432" containerID="8349d3d9c890e0f2559c2e27d15abf20cd2f6a976de2460c5606a8306b102d99" exitCode=0 Oct 03 08:59:23 crc kubenswrapper[4790]: I1003 08:59:23.397925 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-brfpm" event={"ID":"20340665-5e73-4bfd-8a1a-6d21e0d16432","Type":"ContainerDied","Data":"8349d3d9c890e0f2559c2e27d15abf20cd2f6a976de2460c5606a8306b102d99"} Oct 03 08:59:24 crc kubenswrapper[4790]: I1003 08:59:24.410323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8f86f3a-d523-455f-97b2-6995541d8487","Type":"ContainerStarted","Data":"9d20834aa730d9597b074878440f498bf26e6043eac0c87f566906d35d75a87f"} Oct 03 08:59:24 crc kubenswrapper[4790]: I1003 08:59:24.410738 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 08:59:24 crc kubenswrapper[4790]: I1003 08:59:24.448894 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.606267952 podStartE2EDuration="5.448875256s" podCreationTimestamp="2025-10-03 08:59:19 +0000 UTC" firstStartedPulling="2025-10-03 08:59:20.272240442 +0000 UTC m=+1146.625174690" lastFinishedPulling="2025-10-03 08:59:23.114847736 +0000 UTC m=+1149.467781994" observedRunningTime="2025-10-03 08:59:24.442973544 +0000 UTC m=+1150.795907792" watchObservedRunningTime="2025-10-03 08:59:24.448875256 +0000 UTC m=+1150.801809494" Oct 03 08:59:25 crc 
kubenswrapper[4790]: I1003 08:59:25.064437 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-brfpm" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.188456 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk7df\" (UniqueName: \"kubernetes.io/projected/20340665-5e73-4bfd-8a1a-6d21e0d16432-kube-api-access-zk7df\") pod \"20340665-5e73-4bfd-8a1a-6d21e0d16432\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.189306 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-combined-ca-bundle\") pod \"20340665-5e73-4bfd-8a1a-6d21e0d16432\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.189328 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-db-sync-config-data\") pod \"20340665-5e73-4bfd-8a1a-6d21e0d16432\" (UID: \"20340665-5e73-4bfd-8a1a-6d21e0d16432\") " Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.196238 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "20340665-5e73-4bfd-8a1a-6d21e0d16432" (UID: "20340665-5e73-4bfd-8a1a-6d21e0d16432"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.203882 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20340665-5e73-4bfd-8a1a-6d21e0d16432-kube-api-access-zk7df" (OuterVolumeSpecName: "kube-api-access-zk7df") pod "20340665-5e73-4bfd-8a1a-6d21e0d16432" (UID: "20340665-5e73-4bfd-8a1a-6d21e0d16432"). InnerVolumeSpecName "kube-api-access-zk7df". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.222979 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20340665-5e73-4bfd-8a1a-6d21e0d16432" (UID: "20340665-5e73-4bfd-8a1a-6d21e0d16432"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.292319 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.292375 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk7df\" (UniqueName: \"kubernetes.io/projected/20340665-5e73-4bfd-8a1a-6d21e0d16432-kube-api-access-zk7df\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.292401 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20340665-5e73-4bfd-8a1a-6d21e0d16432-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.419663 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-brfpm" 
event={"ID":"20340665-5e73-4bfd-8a1a-6d21e0d16432","Type":"ContainerDied","Data":"393ec01869d710e54cb79af23763bbc6c201fa62633d76b7abd231c77d64ad76"} Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.419708 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="393ec01869d710e54cb79af23763bbc6c201fa62633d76b7abd231c77d64ad76" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.419673 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-brfpm" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.432412 4790 generic.go:334] "Generic (PLEG): container finished" podID="96250103-59af-4ec6-a036-f6e8222b02e0" containerID="98bb661d0f9ba60d912f0c38255f37a9a0aca621bc6e59eb791b9e2f702ed8f5" exitCode=0 Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.433103 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gklmz" event={"ID":"96250103-59af-4ec6-a036-f6e8222b02e0","Type":"ContainerDied","Data":"98bb661d0f9ba60d912f0c38255f37a9a0aca621bc6e59eb791b9e2f702ed8f5"} Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.476820 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.539744 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6454d9f559-66tmq"] Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.539984 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" podUID="1404dbad-1c09-4b76-a6a1-d563ed8d8bde" containerName="dnsmasq-dns" containerID="cri-o://16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6" gracePeriod=10 Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.731033 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-56674f8d84-pn6qq"] Oct 03 
08:59:25 crc kubenswrapper[4790]: E1003 08:59:25.738859 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20340665-5e73-4bfd-8a1a-6d21e0d16432" containerName="barbican-db-sync" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.738898 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20340665-5e73-4bfd-8a1a-6d21e0d16432" containerName="barbican-db-sync" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.739184 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20340665-5e73-4bfd-8a1a-6d21e0d16432" containerName="barbican-db-sync" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.740284 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.743744 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.754260 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.754495 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-kp26b" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.781430 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-765d8bdc5-vfk8w"] Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.783316 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.792648 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.805175 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56674f8d84-pn6qq"] Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.808384 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db45010-78ab-486c-b5be-bc2431f3ab9d-combined-ca-bundle\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.808414 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0db45010-78ab-486c-b5be-bc2431f3ab9d-logs\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.808450 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xctpm\" (UniqueName: \"kubernetes.io/projected/0db45010-78ab-486c-b5be-bc2431f3ab9d-kube-api-access-xctpm\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.808511 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0db45010-78ab-486c-b5be-bc2431f3ab9d-config-data-custom\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.808561 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db45010-78ab-486c-b5be-bc2431f3ab9d-config-data\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.823825 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-765d8bdc5-vfk8w"] Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.869878 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b5f8bcbf9-r6glf"] Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.873043 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.903764 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b5f8bcbf9-r6glf"] Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db45010-78ab-486c-b5be-bc2431f3ab9d-combined-ca-bundle\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913318 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0db45010-78ab-486c-b5be-bc2431f3ab9d-logs\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913342 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shrzv\" (UniqueName: \"kubernetes.io/projected/e00aaa21-5532-446d-b2dd-376c96e17c55-kube-api-access-shrzv\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913375 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xctpm\" (UniqueName: \"kubernetes.io/projected/0db45010-78ab-486c-b5be-bc2431f3ab9d-kube-api-access-xctpm\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913397 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-config-data-custom\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913421 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-config\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913450 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-svc\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913467 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-combined-ca-bundle\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913502 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftc2t\" (UniqueName: \"kubernetes.io/projected/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-kube-api-access-ftc2t\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:25 
crc kubenswrapper[4790]: I1003 08:59:25.913527 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0db45010-78ab-486c-b5be-bc2431f3ab9d-config-data-custom\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913545 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-nb\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913580 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-swift-storage-0\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913623 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-config-data\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913641 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-logs\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " 
pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db45010-78ab-486c-b5be-bc2431f3ab9d-config-data\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.913686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-sb\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.916837 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0db45010-78ab-486c-b5be-bc2431f3ab9d-logs\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.928934 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0db45010-78ab-486c-b5be-bc2431f3ab9d-config-data\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.930221 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0db45010-78ab-486c-b5be-bc2431f3ab9d-config-data-custom\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: 
\"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.940135 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db45010-78ab-486c-b5be-bc2431f3ab9d-combined-ca-bundle\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.948331 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xctpm\" (UniqueName: \"kubernetes.io/projected/0db45010-78ab-486c-b5be-bc2431f3ab9d-kube-api-access-xctpm\") pod \"barbican-keystone-listener-56674f8d84-pn6qq\" (UID: \"0db45010-78ab-486c-b5be-bc2431f3ab9d\") " pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.969283 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.970209 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59b887b4f6-vdnlv"] Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.971708 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.978122 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 03 08:59:25 crc kubenswrapper[4790]: I1003 08:59:25.979441 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59b887b4f6-vdnlv"] Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.016978 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-combined-ca-bundle\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017056 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017101 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftc2t\" (UniqueName: \"kubernetes.io/projected/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-kube-api-access-ftc2t\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017153 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-nb\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " 
pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017229 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-swift-storage-0\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017247 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f334f308-e77d-45ff-8c57-30164f037180-logs\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017463 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-config-data\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017514 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-logs\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-sb\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 
crc kubenswrapper[4790]: I1003 08:59:26.017641 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-combined-ca-bundle\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017705 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shrzv\" (UniqueName: \"kubernetes.io/projected/e00aaa21-5532-446d-b2dd-376c96e17c55-kube-api-access-shrzv\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017744 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data-custom\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017782 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntch7\" (UniqueName: \"kubernetes.io/projected/f334f308-e77d-45ff-8c57-30164f037180-kube-api-access-ntch7\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.017811 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-config-data-custom\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " 
pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.018031 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-config\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.018229 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-svc\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.018320 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-logs\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.018526 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-swift-storage-0\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.020784 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-sb\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.020852 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-nb\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.020968 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-svc\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.024462 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-combined-ca-bundle\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.026234 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-config\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.034415 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-config-data-custom\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.038981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftc2t\" 
(UniqueName: \"kubernetes.io/projected/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-kube-api-access-ftc2t\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.041531 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3c6af9-6ba2-4e18-8779-1e71dbc175a1-config-data\") pod \"barbican-worker-765d8bdc5-vfk8w\" (UID: \"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1\") " pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.046141 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shrzv\" (UniqueName: \"kubernetes.io/projected/e00aaa21-5532-446d-b2dd-376c96e17c55-kube-api-access-shrzv\") pod \"dnsmasq-dns-7b5f8bcbf9-r6glf\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.077984 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.126744 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-765d8bdc5-vfk8w" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.127531 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-combined-ca-bundle\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.127636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data-custom\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.127655 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntch7\" (UniqueName: \"kubernetes.io/projected/f334f308-e77d-45ff-8c57-30164f037180-kube-api-access-ntch7\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.127701 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.127758 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f334f308-e77d-45ff-8c57-30164f037180-logs\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " 
pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.128513 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f334f308-e77d-45ff-8c57-30164f037180-logs\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.136519 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-combined-ca-bundle\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.137145 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data-custom\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.139896 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.149229 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntch7\" (UniqueName: \"kubernetes.io/projected/f334f308-e77d-45ff-8c57-30164f037180-kube-api-access-ntch7\") pod \"barbican-api-59b887b4f6-vdnlv\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: 
I1003 08:59:26.197503 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d5469d486-dtt4h" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.204147 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.206144 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.269452 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.334528 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-sb\") pod \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.334752 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7drk6\" (UniqueName: \"kubernetes.io/projected/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-kube-api-access-7drk6\") pod \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.334794 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-svc\") pod \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.334848 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-swift-storage-0\") pod \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.334887 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-nb\") pod \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.334965 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-config\") pod \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\" (UID: \"1404dbad-1c09-4b76-a6a1-d563ed8d8bde\") " Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.355877 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-kube-api-access-7drk6" (OuterVolumeSpecName: "kube-api-access-7drk6") pod "1404dbad-1c09-4b76-a6a1-d563ed8d8bde" (UID: "1404dbad-1c09-4b76-a6a1-d563ed8d8bde"). InnerVolumeSpecName "kube-api-access-7drk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.437505 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7drk6\" (UniqueName: \"kubernetes.io/projected/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-kube-api-access-7drk6\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.470322 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1404dbad-1c09-4b76-a6a1-d563ed8d8bde" (UID: "1404dbad-1c09-4b76-a6a1-d563ed8d8bde"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.485064 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-config" (OuterVolumeSpecName: "config") pod "1404dbad-1c09-4b76-a6a1-d563ed8d8bde" (UID: "1404dbad-1c09-4b76-a6a1-d563ed8d8bde"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.506459 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1404dbad-1c09-4b76-a6a1-d563ed8d8bde" (UID: "1404dbad-1c09-4b76-a6a1-d563ed8d8bde"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.509271 4790 generic.go:334] "Generic (PLEG): container finished" podID="1404dbad-1c09-4b76-a6a1-d563ed8d8bde" containerID="16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6" exitCode=0 Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.509499 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.510035 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" event={"ID":"1404dbad-1c09-4b76-a6a1-d563ed8d8bde","Type":"ContainerDied","Data":"16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6"} Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.510061 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" event={"ID":"1404dbad-1c09-4b76-a6a1-d563ed8d8bde","Type":"ContainerDied","Data":"1415f324dd7837e61466dd7387c73fb163ce3c0af88885c4d0363d2fcf35bcb5"} Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.510076 4790 scope.go:117] "RemoveContainer" containerID="16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.510207 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1404dbad-1c09-4b76-a6a1-d563ed8d8bde" (UID: "1404dbad-1c09-4b76-a6a1-d563ed8d8bde"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.522375 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1404dbad-1c09-4b76-a6a1-d563ed8d8bde" (UID: "1404dbad-1c09-4b76-a6a1-d563ed8d8bde"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.539742 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.539778 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.539791 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.539802 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.539813 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1404dbad-1c09-4b76-a6a1-d563ed8d8bde-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.582795 4790 scope.go:117] "RemoveContainer" containerID="f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.639524 4790 scope.go:117] "RemoveContainer" containerID="16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6" Oct 03 08:59:26 crc kubenswrapper[4790]: E1003 08:59:26.640839 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6\": 
container with ID starting with 16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6 not found: ID does not exist" containerID="16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.640880 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6"} err="failed to get container status \"16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6\": rpc error: code = NotFound desc = could not find container \"16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6\": container with ID starting with 16961df2c39ced84062e96d16f8afcb5b72c5e8edb0c1bb2c8ec402830f0fbc6 not found: ID does not exist" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.640908 4790 scope.go:117] "RemoveContainer" containerID="f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e" Oct 03 08:59:26 crc kubenswrapper[4790]: E1003 08:59:26.646751 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e\": container with ID starting with f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e not found: ID does not exist" containerID="f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.646781 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e"} err="failed to get container status \"f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e\": rpc error: code = NotFound desc = could not find container \"f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e\": container with ID starting with 
f0a60fb4840f95ea41ae2623394aa25ab6e9fae2e4fd798f9addf6a51bf1c24e not found: ID does not exist" Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.737197 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-56674f8d84-pn6qq"] Oct 03 08:59:26 crc kubenswrapper[4790]: W1003 08:59:26.740126 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db45010_78ab_486c_b5be_bc2431f3ab9d.slice/crio-7180ca92857e9860f079f5d64f5e804601e0b066af108f2a5429a2a12f6f30d0 WatchSource:0}: Error finding container 7180ca92857e9860f079f5d64f5e804601e0b066af108f2a5429a2a12f6f30d0: Status 404 returned error can't find the container with id 7180ca92857e9860f079f5d64f5e804601e0b066af108f2a5429a2a12f6f30d0 Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.862620 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6454d9f559-66tmq"] Oct 03 08:59:26 crc kubenswrapper[4790]: I1003 08:59:26.874038 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6454d9f559-66tmq"] Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.129437 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-gklmz" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.225734 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-765d8bdc5-vfk8w"] Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.260894 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96250103-59af-4ec6-a036-f6e8222b02e0-etc-machine-id\") pod \"96250103-59af-4ec6-a036-f6e8222b02e0\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.261013 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-db-sync-config-data\") pod \"96250103-59af-4ec6-a036-f6e8222b02e0\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.261040 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-combined-ca-bundle\") pod \"96250103-59af-4ec6-a036-f6e8222b02e0\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.261074 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-config-data\") pod \"96250103-59af-4ec6-a036-f6e8222b02e0\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.261154 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv7rx\" (UniqueName: \"kubernetes.io/projected/96250103-59af-4ec6-a036-f6e8222b02e0-kube-api-access-hv7rx\") pod \"96250103-59af-4ec6-a036-f6e8222b02e0\" (UID: 
\"96250103-59af-4ec6-a036-f6e8222b02e0\") " Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.261176 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-scripts\") pod \"96250103-59af-4ec6-a036-f6e8222b02e0\" (UID: \"96250103-59af-4ec6-a036-f6e8222b02e0\") " Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.261279 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96250103-59af-4ec6-a036-f6e8222b02e0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "96250103-59af-4ec6-a036-f6e8222b02e0" (UID: "96250103-59af-4ec6-a036-f6e8222b02e0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.261687 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96250103-59af-4ec6-a036-f6e8222b02e0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.265501 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59b887b4f6-vdnlv"] Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.270933 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-scripts" (OuterVolumeSpecName: "scripts") pod "96250103-59af-4ec6-a036-f6e8222b02e0" (UID: "96250103-59af-4ec6-a036-f6e8222b02e0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.272012 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96250103-59af-4ec6-a036-f6e8222b02e0-kube-api-access-hv7rx" (OuterVolumeSpecName: "kube-api-access-hv7rx") pod "96250103-59af-4ec6-a036-f6e8222b02e0" (UID: "96250103-59af-4ec6-a036-f6e8222b02e0"). InnerVolumeSpecName "kube-api-access-hv7rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.273058 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "96250103-59af-4ec6-a036-f6e8222b02e0" (UID: "96250103-59af-4ec6-a036-f6e8222b02e0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.314856 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96250103-59af-4ec6-a036-f6e8222b02e0" (UID: "96250103-59af-4ec6-a036-f6e8222b02e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.327457 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b5f8bcbf9-r6glf"] Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.363736 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv7rx\" (UniqueName: \"kubernetes.io/projected/96250103-59af-4ec6-a036-f6e8222b02e0-kube-api-access-hv7rx\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.364093 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.364110 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.364121 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.395118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-config-data" (OuterVolumeSpecName: "config-data") pod "96250103-59af-4ec6-a036-f6e8222b02e0" (UID: "96250103-59af-4ec6-a036-f6e8222b02e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.465929 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96250103-59af-4ec6-a036-f6e8222b02e0-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.530087 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-gklmz" event={"ID":"96250103-59af-4ec6-a036-f6e8222b02e0","Type":"ContainerDied","Data":"667fdf6cae749ef69d70ba0c8033b599c355a044033c89f0595179dd66ca3613"} Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.530119 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="667fdf6cae749ef69d70ba0c8033b599c355a044033c89f0595179dd66ca3613" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.530168 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-gklmz" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.532734 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59b887b4f6-vdnlv" event={"ID":"f334f308-e77d-45ff-8c57-30164f037180","Type":"ContainerStarted","Data":"cb394e52a9c3ce4a1011d9ebb09501a30c8b49cbbfad1db98cb632f81401d6cf"} Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.532891 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59b887b4f6-vdnlv" event={"ID":"f334f308-e77d-45ff-8c57-30164f037180","Type":"ContainerStarted","Data":"19ecd839288fc39ac586a2c0669071d05e5e64d95a63012ca15a166a1e3fc220"} Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.534215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" event={"ID":"e00aaa21-5532-446d-b2dd-376c96e17c55","Type":"ContainerStarted","Data":"2e4517c44314708bf6742f761df27850b9bebc48ceb3827b7e31e297a29f365f"} Oct 03 08:59:27 crc 
kubenswrapper[4790]: I1003 08:59:27.535439 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" event={"ID":"0db45010-78ab-486c-b5be-bc2431f3ab9d","Type":"ContainerStarted","Data":"7180ca92857e9860f079f5d64f5e804601e0b066af108f2a5429a2a12f6f30d0"} Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.539973 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-765d8bdc5-vfk8w" event={"ID":"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1","Type":"ContainerStarted","Data":"9a5f9ea8295efc12e4a5f376bc3e1a1e1f4d6a5773d35bee91de9ada02d730b6"} Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.720923 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7c4ff96776-8dljp" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.797830 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:59:27 crc kubenswrapper[4790]: E1003 08:59:27.798246 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1404dbad-1c09-4b76-a6a1-d563ed8d8bde" containerName="init" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.798263 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1404dbad-1c09-4b76-a6a1-d563ed8d8bde" containerName="init" Oct 03 08:59:27 crc kubenswrapper[4790]: E1003 08:59:27.798300 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96250103-59af-4ec6-a036-f6e8222b02e0" containerName="cinder-db-sync" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.798307 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="96250103-59af-4ec6-a036-f6e8222b02e0" containerName="cinder-db-sync" Oct 03 08:59:27 crc kubenswrapper[4790]: E1003 08:59:27.798319 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1404dbad-1c09-4b76-a6a1-d563ed8d8bde" containerName="dnsmasq-dns" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.798325 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1404dbad-1c09-4b76-a6a1-d563ed8d8bde" containerName="dnsmasq-dns" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.798482 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1404dbad-1c09-4b76-a6a1-d563ed8d8bde" containerName="dnsmasq-dns" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.798504 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="96250103-59af-4ec6-a036-f6e8222b02e0" containerName="cinder-db-sync" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.800187 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.804232 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-5pgsx" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.804480 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.810987 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.815681 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.818920 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.887299 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m45rv\" (UniqueName: \"kubernetes.io/projected/dded1c26-62f8-4b21-a7da-f8d4b25bb677-kube-api-access-m45rv\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.887612 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.887793 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.887923 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dded1c26-62f8-4b21-a7da-f8d4b25bb677-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.887998 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-scripts\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.888103 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.905843 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7b5f8bcbf9-r6glf"] Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.929235 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66cd99db9f-qwm9f"] Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.938945 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cd99db9f-qwm9f"] Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.939051 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.993127 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.993243 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dded1c26-62f8-4b21-a7da-f8d4b25bb677-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.993272 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-scripts\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.993341 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " 
pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.993383 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m45rv\" (UniqueName: \"kubernetes.io/projected/dded1c26-62f8-4b21-a7da-f8d4b25bb677-kube-api-access-m45rv\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.993452 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:27 crc kubenswrapper[4790]: I1003 08:59:27.997701 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dded1c26-62f8-4b21-a7da-f8d4b25bb677-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.001479 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-scripts\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.005177 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.012332 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.014854 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.045277 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m45rv\" (UniqueName: \"kubernetes.io/projected/dded1c26-62f8-4b21-a7da-f8d4b25bb677-kube-api-access-m45rv\") pod \"cinder-scheduler-0\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.048856 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.050543 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.064227 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.087060 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.105066 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-nb\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.105102 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-swift-storage-0\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.105160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b5l6\" (UniqueName: \"kubernetes.io/projected/5d1475d1-40ee-4daf-a4ad-f52488205cc0-kube-api-access-8b5l6\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.105203 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-config\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " 
pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.105235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-svc\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.105264 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-sb\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.149532 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b5l6\" (UniqueName: \"kubernetes.io/projected/5d1475d1-40ee-4daf-a4ad-f52488205cc0-kube-api-access-8b5l6\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207385 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207415 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/adfeaccd-4cd7-4038-a189-050439f6407a-logs\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207467 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmdtr\" (UniqueName: \"kubernetes.io/projected/adfeaccd-4cd7-4038-a189-050439f6407a-kube-api-access-jmdtr\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-config\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207534 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-svc\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207563 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-sb\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207600 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-scripts\") pod \"cinder-api-0\" (UID: 
\"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207637 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data-custom\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-nb\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207719 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-swift-storage-0\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207772 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adfeaccd-4cd7-4038-a189-050439f6407a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.207793 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" 
Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.210440 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-config\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.211197 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-svc\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.212433 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-sb\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.216495 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-swift-storage-0\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.221124 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-nb\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.235851 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8b5l6\" (UniqueName: \"kubernetes.io/projected/5d1475d1-40ee-4daf-a4ad-f52488205cc0-kube-api-access-8b5l6\") pod \"dnsmasq-dns-66cd99db9f-qwm9f\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.309588 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adfeaccd-4cd7-4038-a189-050439f6407a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.309632 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.309682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.309782 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adfeaccd-4cd7-4038-a189-050439f6407a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.309708 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfeaccd-4cd7-4038-a189-050439f6407a-logs\") pod \"cinder-api-0\" (UID: 
\"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.310170 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmdtr\" (UniqueName: \"kubernetes.io/projected/adfeaccd-4cd7-4038-a189-050439f6407a-kube-api-access-jmdtr\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.310284 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-scripts\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.310327 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data-custom\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.312133 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfeaccd-4cd7-4038-a189-050439f6407a-logs\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.317158 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.320436 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.322236 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-scripts\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.322726 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data-custom\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.333167 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmdtr\" (UniqueName: \"kubernetes.io/projected/adfeaccd-4cd7-4038-a189-050439f6407a-kube-api-access-jmdtr\") pod \"cinder-api-0\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.374123 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1404dbad-1c09-4b76-a6a1-d563ed8d8bde" path="/var/lib/kubelet/pods/1404dbad-1c09-4b76-a6a1-d563ed8d8bde/volumes" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.443394 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.443962 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.616560 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59b887b4f6-vdnlv" event={"ID":"f334f308-e77d-45ff-8c57-30164f037180","Type":"ContainerStarted","Data":"e84566cac6111f752ce54b2300b992075f833822e1ee7f609aa4d7a9799fe899"} Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.617220 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.619815 4790 generic.go:334] "Generic (PLEG): container finished" podID="e00aaa21-5532-446d-b2dd-376c96e17c55" containerID="40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d" exitCode=0 Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.619863 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" event={"ID":"e00aaa21-5532-446d-b2dd-376c96e17c55","Type":"ContainerDied","Data":"40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d"} Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.658655 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-59b887b4f6-vdnlv" podStartSLOduration=3.658635638 podStartE2EDuration="3.658635638s" podCreationTimestamp="2025-10-03 08:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:28.651303646 +0000 UTC m=+1155.004237884" watchObservedRunningTime="2025-10-03 08:59:28.658635638 +0000 UTC m=+1155.011569876" Oct 03 08:59:28 crc kubenswrapper[4790]: I1003 08:59:28.765057 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:59:29 crc kubenswrapper[4790]: I1003 08:59:29.479393 4790 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-7d76b4b9d6-n5tqf" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.158:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.158:8443: connect: connection refused" Oct 03 08:59:29 crc kubenswrapper[4790]: I1003 08:59:29.479700 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d76b4b9d6-n5tqf" Oct 03 08:59:29 crc kubenswrapper[4790]: I1003 08:59:29.638731 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dded1c26-62f8-4b21-a7da-f8d4b25bb677","Type":"ContainerStarted","Data":"cf596d6400578bbacd511876623c828570a7eacb8dd11836cf9f5494bed89778"} Oct 03 08:59:29 crc kubenswrapper[4790]: I1003 08:59:29.639081 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.392415 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.393800 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.399986 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.400931 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-2klb7" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.405131 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.428141 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.585544 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.585962 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config-secret\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.586188 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.586459 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-675zr\" (UniqueName: \"kubernetes.io/projected/f386edcc-4d35-455c-86fa-06fc2ae385cf-kube-api-access-675zr\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.684804 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 03 08:59:30 crc kubenswrapper[4790]: E1003 08:59:30.686211 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-675zr openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="f386edcc-4d35-455c-86fa-06fc2ae385cf" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.689247 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-675zr\" (UniqueName: \"kubernetes.io/projected/f386edcc-4d35-455c-86fa-06fc2ae385cf-kube-api-access-675zr\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.689414 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.689489 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config-secret\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 
08:59:30.689611 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.694378 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: E1003 08:59:30.696118 4790 projected.go:194] Error preparing data for projected volume kube-api-access-675zr for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 03 08:59:30 crc kubenswrapper[4790]: E1003 08:59:30.696197 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f386edcc-4d35-455c-86fa-06fc2ae385cf-kube-api-access-675zr podName:f386edcc-4d35-455c-86fa-06fc2ae385cf nodeName:}" failed. No retries permitted until 2025-10-03 08:59:31.196173977 +0000 UTC m=+1157.549108285 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-675zr" (UniqueName: "kubernetes.io/projected/f386edcc-4d35-455c-86fa-06fc2ae385cf-kube-api-access-675zr") pod "openstackclient" (UID: "f386edcc-4d35-455c-86fa-06fc2ae385cf") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.703263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.707283 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config-secret\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.717768 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.732658 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.734192 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.777924 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.793021 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7a835cde-9c23-4cf9-9473-35d01cf2c528-openstack-config\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.793069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64z2f\" (UniqueName: \"kubernetes.io/projected/7a835cde-9c23-4cf9-9473-35d01cf2c528-kube-api-access-64z2f\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.793121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7a835cde-9c23-4cf9-9473-35d01cf2c528-openstack-config-secret\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.793139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a835cde-9c23-4cf9-9473-35d01cf2c528-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.895886 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/7a835cde-9c23-4cf9-9473-35d01cf2c528-openstack-config\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.896267 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64z2f\" (UniqueName: \"kubernetes.io/projected/7a835cde-9c23-4cf9-9473-35d01cf2c528-kube-api-access-64z2f\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.896406 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7a835cde-9c23-4cf9-9473-35d01cf2c528-openstack-config-secret\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.896425 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a835cde-9c23-4cf9-9473-35d01cf2c528-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.898030 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7a835cde-9c23-4cf9-9473-35d01cf2c528-openstack-config\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.900490 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a835cde-9c23-4cf9-9473-35d01cf2c528-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " 
pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.905914 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7a835cde-9c23-4cf9-9473-35d01cf2c528-openstack-config-secret\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.915649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64z2f\" (UniqueName: \"kubernetes.io/projected/7a835cde-9c23-4cf9-9473-35d01cf2c528-kube-api-access-64z2f\") pod \"openstackclient\" (UID: \"7a835cde-9c23-4cf9-9473-35d01cf2c528\") " pod="openstack/openstackclient" Oct 03 08:59:30 crc kubenswrapper[4790]: I1003 08:59:30.928660 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6454d9f559-66tmq" podUID="1404dbad-1c09-4b76-a6a1-d563ed8d8bde" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.152:5353: i/o timeout" Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.058886 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cd99db9f-qwm9f"] Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.069214 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.093935 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 08:59:31 crc kubenswrapper[4790]: W1003 08:59:31.112725 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadfeaccd_4cd7_4038_a189_050439f6407a.slice/crio-a3465a659dcf805935038cbea705469ab0b6f95123fe9223cba9f81034531b16 WatchSource:0}: Error finding container a3465a659dcf805935038cbea705469ab0b6f95123fe9223cba9f81034531b16: Status 404 returned error can't find the container with id a3465a659dcf805935038cbea705469ab0b6f95123fe9223cba9f81034531b16 Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.210093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-675zr\" (UniqueName: \"kubernetes.io/projected/f386edcc-4d35-455c-86fa-06fc2ae385cf-kube-api-access-675zr\") pod \"openstackclient\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " pod="openstack/openstackclient" Oct 03 08:59:31 crc kubenswrapper[4790]: E1003 08:59:31.212691 4790 projected.go:194] Error preparing data for projected volume kube-api-access-675zr for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (f386edcc-4d35-455c-86fa-06fc2ae385cf) does not match the UID in record. The object might have been deleted and then recreated Oct 03 08:59:31 crc kubenswrapper[4790]: E1003 08:59:31.212776 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f386edcc-4d35-455c-86fa-06fc2ae385cf-kube-api-access-675zr podName:f386edcc-4d35-455c-86fa-06fc2ae385cf nodeName:}" failed. No retries permitted until 2025-10-03 08:59:32.212752124 +0000 UTC m=+1158.565686372 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-675zr" (UniqueName: "kubernetes.io/projected/f386edcc-4d35-455c-86fa-06fc2ae385cf-kube-api-access-675zr") pod "openstackclient" (UID: "f386edcc-4d35-455c-86fa-06fc2ae385cf") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (f386edcc-4d35-455c-86fa-06fc2ae385cf) does not match the UID in record. The object might have been deleted and then recreated Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.315989 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.670307 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" event={"ID":"e00aaa21-5532-446d-b2dd-376c96e17c55","Type":"ContainerStarted","Data":"9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d"} Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.670659 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" podUID="e00aaa21-5532-446d-b2dd-376c96e17c55" containerName="dnsmasq-dns" containerID="cri-o://9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d" gracePeriod=10 Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.670847 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.679273 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" event={"ID":"0db45010-78ab-486c-b5be-bc2431f3ab9d","Type":"ContainerStarted","Data":"43d5e9f801ec11416fbc430b7dc05152b3526e95630ec827261996561f0fa489"} Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.679337 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" 
event={"ID":"0db45010-78ab-486c-b5be-bc2431f3ab9d","Type":"ContainerStarted","Data":"7351688ecbf667b2924616157680db121073f4a55ffdb334a360345392f3d3ff"} Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.680723 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"adfeaccd-4cd7-4038-a189-050439f6407a","Type":"ContainerStarted","Data":"a3465a659dcf805935038cbea705469ab0b6f95123fe9223cba9f81034531b16"} Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.683602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-765d8bdc5-vfk8w" event={"ID":"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1","Type":"ContainerStarted","Data":"618e39acc901d8732884d54e801791419e21477cfcd7348ed7f62a44242f7f23"} Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.683643 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-765d8bdc5-vfk8w" event={"ID":"7a3c6af9-6ba2-4e18-8779-1e71dbc175a1","Type":"ContainerStarted","Data":"1cfb0c3637ac5c6d261a284bcf22f34eef1acb0eb0c86944513f8eea9219822b"} Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.685682 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.686177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" event={"ID":"5d1475d1-40ee-4daf-a4ad-f52488205cc0","Type":"ContainerStarted","Data":"9f1a374df2e3e420a0d83c2145ea53c80d54efbcdf52e221e00cd34741ffcb14"} Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.686214 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" event={"ID":"5d1475d1-40ee-4daf-a4ad-f52488205cc0","Type":"ContainerStarted","Data":"45f9b309c2953395a38c17e5dbdb1b0ccaa4165b1663e892840088cab640e128"} Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.696483 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f386edcc-4d35-455c-86fa-06fc2ae385cf" podUID="7a835cde-9c23-4cf9-9473-35d01cf2c528" Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.706396 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" podStartSLOduration=6.706374633 podStartE2EDuration="6.706374633s" podCreationTimestamp="2025-10-03 08:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:31.694150727 +0000 UTC m=+1158.047084985" watchObservedRunningTime="2025-10-03 08:59:31.706374633 +0000 UTC m=+1158.059308871" Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.738078 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-765d8bdc5-vfk8w" podStartSLOduration=3.513762737 podStartE2EDuration="6.738052801s" podCreationTimestamp="2025-10-03 08:59:25 +0000 UTC" firstStartedPulling="2025-10-03 08:59:27.20064976 +0000 UTC m=+1153.553583988" lastFinishedPulling="2025-10-03 08:59:30.424939814 +0000 UTC m=+1156.777874052" 
observedRunningTime="2025-10-03 08:59:31.721428845 +0000 UTC m=+1158.074363103" watchObservedRunningTime="2025-10-03 08:59:31.738052801 +0000 UTC m=+1158.090987039" Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.850331 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.889771 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-56674f8d84-pn6qq" podStartSLOduration=3.208414349 podStartE2EDuration="6.889750578s" podCreationTimestamp="2025-10-03 08:59:25 +0000 UTC" firstStartedPulling="2025-10-03 08:59:26.743402239 +0000 UTC m=+1153.096336477" lastFinishedPulling="2025-10-03 08:59:30.424738478 +0000 UTC m=+1156.777672706" observedRunningTime="2025-10-03 08:59:31.78764057 +0000 UTC m=+1158.140574838" watchObservedRunningTime="2025-10-03 08:59:31.889750578 +0000 UTC m=+1158.242684816" Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.928923 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.929353 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.930027 4790 scope.go:117] "RemoveContainer" containerID="e09e195cd4e8efea622e902164e68ef566c1f2278445f5df4794272fd43c9eb9" Oct 03 08:59:31 crc kubenswrapper[4790]: I1003 08:59:31.951440 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.054817 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config\") pod \"f386edcc-4d35-455c-86fa-06fc2ae385cf\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.054881 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config-secret\") pod \"f386edcc-4d35-455c-86fa-06fc2ae385cf\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.054949 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-combined-ca-bundle\") pod \"f386edcc-4d35-455c-86fa-06fc2ae385cf\" (UID: \"f386edcc-4d35-455c-86fa-06fc2ae385cf\") " Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.056570 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f386edcc-4d35-455c-86fa-06fc2ae385cf" (UID: "f386edcc-4d35-455c-86fa-06fc2ae385cf"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.058269 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-675zr\" (UniqueName: \"kubernetes.io/projected/f386edcc-4d35-455c-86fa-06fc2ae385cf-kube-api-access-675zr\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.058643 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.062706 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f386edcc-4d35-455c-86fa-06fc2ae385cf" (UID: "f386edcc-4d35-455c-86fa-06fc2ae385cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.063459 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f386edcc-4d35-455c-86fa-06fc2ae385cf" (UID: "f386edcc-4d35-455c-86fa-06fc2ae385cf"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.160050 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.160082 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f386edcc-4d35-455c-86fa-06fc2ae385cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:32 crc kubenswrapper[4790]: E1003 08:59:32.310136 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice/crio-b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice/crio-166a3036570750727b0dd758229bd28aafecd40eb06b2140fd160ae44b5f7978\": RecentStats: unable to find data in memory cache]" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.342840 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.356708 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f386edcc-4d35-455c-86fa-06fc2ae385cf" path="/var/lib/kubelet/pods/f386edcc-4d35-455c-86fa-06fc2ae385cf/volumes" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.464684 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-svc\") pod \"e00aaa21-5532-446d-b2dd-376c96e17c55\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.465007 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shrzv\" (UniqueName: \"kubernetes.io/projected/e00aaa21-5532-446d-b2dd-376c96e17c55-kube-api-access-shrzv\") pod \"e00aaa21-5532-446d-b2dd-376c96e17c55\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.465043 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-sb\") pod \"e00aaa21-5532-446d-b2dd-376c96e17c55\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.465125 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-nb\") pod \"e00aaa21-5532-446d-b2dd-376c96e17c55\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.465191 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-swift-storage-0\") pod \"e00aaa21-5532-446d-b2dd-376c96e17c55\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.465240 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-config\") pod \"e00aaa21-5532-446d-b2dd-376c96e17c55\" (UID: \"e00aaa21-5532-446d-b2dd-376c96e17c55\") " Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.494690 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00aaa21-5532-446d-b2dd-376c96e17c55-kube-api-access-shrzv" (OuterVolumeSpecName: "kube-api-access-shrzv") pod "e00aaa21-5532-446d-b2dd-376c96e17c55" (UID: "e00aaa21-5532-446d-b2dd-376c96e17c55"). InnerVolumeSpecName "kube-api-access-shrzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.567175 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shrzv\" (UniqueName: \"kubernetes.io/projected/e00aaa21-5532-446d-b2dd-376c96e17c55-kube-api-access-shrzv\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.581853 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e00aaa21-5532-446d-b2dd-376c96e17c55" (UID: "e00aaa21-5532-446d-b2dd-376c96e17c55"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.584751 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e00aaa21-5532-446d-b2dd-376c96e17c55" (UID: "e00aaa21-5532-446d-b2dd-376c96e17c55"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.606262 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-config" (OuterVolumeSpecName: "config") pod "e00aaa21-5532-446d-b2dd-376c96e17c55" (UID: "e00aaa21-5532-446d-b2dd-376c96e17c55"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.610010 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e00aaa21-5532-446d-b2dd-376c96e17c55" (UID: "e00aaa21-5532-446d-b2dd-376c96e17c55"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.639874 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e00aaa21-5532-446d-b2dd-376c96e17c55" (UID: "e00aaa21-5532-446d-b2dd-376c96e17c55"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.669471 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.669515 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.669530 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.669541 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.669552 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e00aaa21-5532-446d-b2dd-376c96e17c55-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.710883 4790 generic.go:334] "Generic (PLEG): container finished" podID="5d1475d1-40ee-4daf-a4ad-f52488205cc0" containerID="9f1a374df2e3e420a0d83c2145ea53c80d54efbcdf52e221e00cd34741ffcb14" exitCode=0 Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.711013 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" event={"ID":"5d1475d1-40ee-4daf-a4ad-f52488205cc0","Type":"ContainerDied","Data":"9f1a374df2e3e420a0d83c2145ea53c80d54efbcdf52e221e00cd34741ffcb14"} Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 
08:59:32.727052 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7a835cde-9c23-4cf9-9473-35d01cf2c528","Type":"ContainerStarted","Data":"fb46ff18a4124cb51890c9366e29eed2002dbad6a5d374edd244c0e1d856c780"} Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.745399 4790 generic.go:334] "Generic (PLEG): container finished" podID="e00aaa21-5532-446d-b2dd-376c96e17c55" containerID="9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d" exitCode=0 Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.745482 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" event={"ID":"e00aaa21-5532-446d-b2dd-376c96e17c55","Type":"ContainerDied","Data":"9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d"} Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.745509 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" event={"ID":"e00aaa21-5532-446d-b2dd-376c96e17c55","Type":"ContainerDied","Data":"2e4517c44314708bf6742f761df27850b9bebc48ceb3827b7e31e297a29f365f"} Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.745525 4790 scope.go:117] "RemoveContainer" containerID="9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.745669 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b5f8bcbf9-r6glf" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.755610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f0afbb0f-f4dc-4814-8c7f-791351972c0b","Type":"ContainerStarted","Data":"f7e1596a2e3a93c9acb16c7a72441fad846a55cd8aad3832a2f6428d26e70811"} Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.761852 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.761983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dded1c26-62f8-4b21-a7da-f8d4b25bb677","Type":"ContainerStarted","Data":"5c5f8b72278b652f33128ff7085aae88d1656148c0307c10c8c9d6bfefa75a2c"} Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.811304 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f386edcc-4d35-455c-86fa-06fc2ae385cf" podUID="7a835cde-9c23-4cf9-9473-35d01cf2c528" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.821500 4790 scope.go:117] "RemoveContainer" containerID="40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.848276 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b5f8bcbf9-r6glf"] Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.878313 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b5f8bcbf9-r6glf"] Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.899831 4790 scope.go:117] "RemoveContainer" containerID="9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d" Oct 03 08:59:32 crc kubenswrapper[4790]: E1003 08:59:32.900976 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d\": container with ID starting with 9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d not found: ID does not exist" containerID="9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.901017 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d"} err="failed to get container status \"9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d\": rpc error: code = NotFound desc = could not find container \"9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d\": container with ID starting with 9ea98f8b6cc070542866e903df2dfb93129fa70bd34119938478e3dc6d32764d not found: ID does not exist" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.901041 4790 scope.go:117] "RemoveContainer" containerID="40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d" Oct 03 08:59:32 crc kubenswrapper[4790]: E1003 08:59:32.901258 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d\": container with ID starting with 40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d not found: ID does not exist" containerID="40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.901288 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d"} err="failed to get container status \"40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d\": rpc error: code = NotFound desc = could not find container \"40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d\": container with ID starting with 40f77b7315ec79f9603ef9c8f95adc7229e18cb5adeb8bb35f9da41d2541281d not found: ID does not exist" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.963912 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55549c44b-l2c25"] Oct 03 08:59:32 crc kubenswrapper[4790]: E1003 08:59:32.964486 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e00aaa21-5532-446d-b2dd-376c96e17c55" containerName="dnsmasq-dns" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.964511 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00aaa21-5532-446d-b2dd-376c96e17c55" containerName="dnsmasq-dns" Oct 03 08:59:32 crc kubenswrapper[4790]: E1003 08:59:32.964563 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00aaa21-5532-446d-b2dd-376c96e17c55" containerName="init" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.964587 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00aaa21-5532-446d-b2dd-376c96e17c55" containerName="init" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.964818 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00aaa21-5532-446d-b2dd-376c96e17c55" containerName="dnsmasq-dns" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.966157 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.982851 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55549c44b-l2c25"] Oct 03 08:59:32 crc kubenswrapper[4790]: I1003 08:59:32.989921 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.007954 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.095035 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-internal-tls-certs\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.095111 
4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-public-tls-certs\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.095139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl2s6\" (UniqueName: \"kubernetes.io/projected/f1f79552-9846-40d8-b095-ac773b5af079-kube-api-access-pl2s6\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.095182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-config-data-custom\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.095207 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-config-data\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.095260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f79552-9846-40d8-b095-ac773b5af079-logs\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc 
kubenswrapper[4790]: I1003 08:59:33.095297 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-combined-ca-bundle\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.197724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-internal-tls-certs\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.197790 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-public-tls-certs\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.197816 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl2s6\" (UniqueName: \"kubernetes.io/projected/f1f79552-9846-40d8-b095-ac773b5af079-kube-api-access-pl2s6\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.197855 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-config-data-custom\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc 
kubenswrapper[4790]: I1003 08:59:33.197877 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-config-data\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.197925 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f79552-9846-40d8-b095-ac773b5af079-logs\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.197960 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-combined-ca-bundle\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.205054 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f1f79552-9846-40d8-b095-ac773b5af079-logs\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.213187 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-internal-tls-certs\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.218273 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-pl2s6\" (UniqueName: \"kubernetes.io/projected/f1f79552-9846-40d8-b095-ac773b5af079-kube-api-access-pl2s6\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.220484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-config-data-custom\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.221600 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-config-data\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.222082 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-combined-ca-bundle\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.232219 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f79552-9846-40d8-b095-ac773b5af079-public-tls-certs\") pod \"barbican-api-55549c44b-l2c25\" (UID: \"f1f79552-9846-40d8-b095-ac773b5af079\") " pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.343206 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.791028 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" event={"ID":"5d1475d1-40ee-4daf-a4ad-f52488205cc0","Type":"ContainerStarted","Data":"b690c39b3b3f4a00e09558c8151b25a3c526c946ee83f9dd9530853ac267bfdb"} Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.792559 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.801359 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"adfeaccd-4cd7-4038-a189-050439f6407a","Type":"ContainerStarted","Data":"9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618"} Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.812743 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dded1c26-62f8-4b21-a7da-f8d4b25bb677","Type":"ContainerStarted","Data":"763b42a2a4547ba508e9d42da461fa8e7d0cb131165109e046d88f98df39bcb0"} Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.814250 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" podStartSLOduration=6.81423477 podStartE2EDuration="6.81423477s" podCreationTimestamp="2025-10-03 08:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:33.811979248 +0000 UTC m=+1160.164913506" watchObservedRunningTime="2025-10-03 08:59:33.81423477 +0000 UTC m=+1160.167169008" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.872964 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.444858322 podStartE2EDuration="6.872933779s" 
podCreationTimestamp="2025-10-03 08:59:27 +0000 UTC" firstStartedPulling="2025-10-03 08:59:29.355761973 +0000 UTC m=+1155.708696211" lastFinishedPulling="2025-10-03 08:59:30.78383743 +0000 UTC m=+1157.136771668" observedRunningTime="2025-10-03 08:59:33.848638683 +0000 UTC m=+1160.201572921" watchObservedRunningTime="2025-10-03 08:59:33.872933779 +0000 UTC m=+1160.225868017" Oct 03 08:59:33 crc kubenswrapper[4790]: I1003 08:59:33.985247 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55549c44b-l2c25"] Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.351030 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00aaa21-5532-446d-b2dd-376c96e17c55" path="/var/lib/kubelet/pods/e00aaa21-5532-446d-b2dd-376c96e17c55/volumes" Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.837308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"adfeaccd-4cd7-4038-a189-050439f6407a","Type":"ContainerStarted","Data":"4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29"} Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.837857 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="adfeaccd-4cd7-4038-a189-050439f6407a" containerName="cinder-api-log" containerID="cri-o://9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618" gracePeriod=30 Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.838022 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.838287 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="adfeaccd-4cd7-4038-a189-050439f6407a" containerName="cinder-api" containerID="cri-o://4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29" gracePeriod=30 Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 
08:59:34.844725 4790 generic.go:334] "Generic (PLEG): container finished" podID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerID="bc33e5db07b735662f20eb2b839ba0dbfaa1141b576d9dfed09994c044e63804" exitCode=137 Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.844835 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d76b4b9d6-n5tqf" event={"ID":"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5","Type":"ContainerDied","Data":"bc33e5db07b735662f20eb2b839ba0dbfaa1141b576d9dfed09994c044e63804"} Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.868241 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55549c44b-l2c25" event={"ID":"f1f79552-9846-40d8-b095-ac773b5af079","Type":"ContainerStarted","Data":"4c1d8edc4f1ec89982e9305e0e324be62dc784f0a0e2c87bc7b6adbf406317f5"} Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.868295 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55549c44b-l2c25" event={"ID":"f1f79552-9846-40d8-b095-ac773b5af079","Type":"ContainerStarted","Data":"e63719f3ed649c25cfe79024fd0721983f2f5dde5d27d45200070ba579cf9316"} Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.868313 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55549c44b-l2c25" event={"ID":"f1f79552-9846-40d8-b095-ac773b5af079","Type":"ContainerStarted","Data":"429909f6f70270848e7a8bc9f124b95e6efabf86027b080d68b5edce8da846d7"} Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.874403 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:34 crc kubenswrapper[4790]: I1003 08:59:34.914674 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.914635668 podStartE2EDuration="7.914635668s" podCreationTimestamp="2025-10-03 08:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:34.865982974 +0000 UTC m=+1161.218917202" watchObservedRunningTime="2025-10-03 08:59:34.914635668 +0000 UTC m=+1161.267569906" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.016708 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d76b4b9d6-n5tqf" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.039543 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55549c44b-l2c25" podStartSLOduration=3.03952766 podStartE2EDuration="3.03952766s" podCreationTimestamp="2025-10-03 08:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:34.90706937 +0000 UTC m=+1161.260003628" watchObservedRunningTime="2025-10-03 08:59:35.03952766 +0000 UTC m=+1161.392461898" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.174242 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-secret-key\") pod \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.174335 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-combined-ca-bundle\") pod \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.174393 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-tls-certs\") pod \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\" (UID: 
\"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.174468 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx4h7\" (UniqueName: \"kubernetes.io/projected/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-kube-api-access-zx4h7\") pod \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.174524 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-logs\") pod \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.174600 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-config-data\") pod \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.174629 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-scripts\") pod \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\" (UID: \"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.179409 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-logs" (OuterVolumeSpecName: "logs") pod "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" (UID: "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.188655 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" (UID: "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.191038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-kube-api-access-zx4h7" (OuterVolumeSpecName: "kube-api-access-zx4h7") pod "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" (UID: "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5"). InnerVolumeSpecName "kube-api-access-zx4h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.210951 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" (UID: "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.220804 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-scripts" (OuterVolumeSpecName: "scripts") pod "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" (UID: "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.230918 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-config-data" (OuterVolumeSpecName: "config-data") pod "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" (UID: "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.261679 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" (UID: "f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.276779 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.276808 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.276819 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx4h7\" (UniqueName: \"kubernetes.io/projected/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-kube-api-access-zx4h7\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.276830 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-logs\") on node \"crc\" 
DevicePath \"\"" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.276840 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.276849 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.276859 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.864651 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.888132 4790 generic.go:334] "Generic (PLEG): container finished" podID="adfeaccd-4cd7-4038-a189-050439f6407a" containerID="4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29" exitCode=0 Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.888191 4790 generic.go:334] "Generic (PLEG): container finished" podID="adfeaccd-4cd7-4038-a189-050439f6407a" containerID="9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618" exitCode=143 Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.888274 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"adfeaccd-4cd7-4038-a189-050439f6407a","Type":"ContainerDied","Data":"4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29"} Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.888329 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"adfeaccd-4cd7-4038-a189-050439f6407a","Type":"ContainerDied","Data":"9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618"} Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.888347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"adfeaccd-4cd7-4038-a189-050439f6407a","Type":"ContainerDied","Data":"a3465a659dcf805935038cbea705469ab0b6f95123fe9223cba9f81034531b16"} Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.888366 4790 scope.go:117] "RemoveContainer" containerID="4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.888566 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.907224 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d76b4b9d6-n5tqf" event={"ID":"f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5","Type":"ContainerDied","Data":"80ff8f257cad22f768c65b4fbdbfe5c80ba083d64adccf248419774a119fc165"} Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.907747 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.907906 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d76b4b9d6-n5tqf" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.962718 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d76b4b9d6-n5tqf"] Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.979512 4790 scope.go:117] "RemoveContainer" containerID="9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.981225 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7d76b4b9d6-n5tqf"] Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.996524 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data\") pod \"adfeaccd-4cd7-4038-a189-050439f6407a\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.996591 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adfeaccd-4cd7-4038-a189-050439f6407a-etc-machine-id\") pod \"adfeaccd-4cd7-4038-a189-050439f6407a\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.996708 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-scripts\") pod \"adfeaccd-4cd7-4038-a189-050439f6407a\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.996780 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfeaccd-4cd7-4038-a189-050439f6407a-logs\") pod \"adfeaccd-4cd7-4038-a189-050439f6407a\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.996814 
4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data-custom\") pod \"adfeaccd-4cd7-4038-a189-050439f6407a\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.996837 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-combined-ca-bundle\") pod \"adfeaccd-4cd7-4038-a189-050439f6407a\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.996910 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adfeaccd-4cd7-4038-a189-050439f6407a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "adfeaccd-4cd7-4038-a189-050439f6407a" (UID: "adfeaccd-4cd7-4038-a189-050439f6407a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.996964 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmdtr\" (UniqueName: \"kubernetes.io/projected/adfeaccd-4cd7-4038-a189-050439f6407a-kube-api-access-jmdtr\") pod \"adfeaccd-4cd7-4038-a189-050439f6407a\" (UID: \"adfeaccd-4cd7-4038-a189-050439f6407a\") " Oct 03 08:59:35 crc kubenswrapper[4790]: I1003 08:59:35.997755 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/adfeaccd-4cd7-4038-a189-050439f6407a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:35.999091 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adfeaccd-4cd7-4038-a189-050439f6407a-logs" (OuterVolumeSpecName: "logs") pod "adfeaccd-4cd7-4038-a189-050439f6407a" (UID: "adfeaccd-4cd7-4038-a189-050439f6407a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.007329 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adfeaccd-4cd7-4038-a189-050439f6407a-kube-api-access-jmdtr" (OuterVolumeSpecName: "kube-api-access-jmdtr") pod "adfeaccd-4cd7-4038-a189-050439f6407a" (UID: "adfeaccd-4cd7-4038-a189-050439f6407a"). InnerVolumeSpecName "kube-api-access-jmdtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.015565 4790 scope.go:117] "RemoveContainer" containerID="4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29" Oct 03 08:59:36 crc kubenswrapper[4790]: E1003 08:59:36.016133 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29\": container with ID starting with 4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29 not found: ID does not exist" containerID="4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.016171 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29"} err="failed to get container status \"4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29\": rpc error: code = NotFound desc = could not find container \"4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29\": container with ID starting with 4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29 not found: ID does not exist" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.016196 4790 scope.go:117] "RemoveContainer" containerID="9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618" Oct 03 08:59:36 crc kubenswrapper[4790]: E1003 08:59:36.016534 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618\": container with ID starting with 9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618 not found: ID does not exist" containerID="9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.016566 
4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618"} err="failed to get container status \"9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618\": rpc error: code = NotFound desc = could not find container \"9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618\": container with ID starting with 9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618 not found: ID does not exist" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.016795 4790 scope.go:117] "RemoveContainer" containerID="4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.018246 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29"} err="failed to get container status \"4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29\": rpc error: code = NotFound desc = could not find container \"4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29\": container with ID starting with 4f2cbcb0d2c058b0dfd6e72c36d3b3ee0bfd21d2339dfc7369bb8990645f2f29 not found: ID does not exist" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.018273 4790 scope.go:117] "RemoveContainer" containerID="9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.019508 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618"} err="failed to get container status \"9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618\": rpc error: code = NotFound desc = could not find container \"9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618\": container with ID starting with 
9afdc3825298a6690ac8ddc251e334d2f7b17c687a5f5f95a33c197bba4f5618 not found: ID does not exist" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.019533 4790 scope.go:117] "RemoveContainer" containerID="4a4cc8546043f16febf7615f741031a66032e2c6d6c4ccd1550b14aee53138a2" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.026467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "adfeaccd-4cd7-4038-a189-050439f6407a" (UID: "adfeaccd-4cd7-4038-a189-050439f6407a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.026714 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-scripts" (OuterVolumeSpecName: "scripts") pod "adfeaccd-4cd7-4038-a189-050439f6407a" (UID: "adfeaccd-4cd7-4038-a189-050439f6407a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.058056 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adfeaccd-4cd7-4038-a189-050439f6407a" (UID: "adfeaccd-4cd7-4038-a189-050439f6407a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.063151 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data" (OuterVolumeSpecName: "config-data") pod "adfeaccd-4cd7-4038-a189-050439f6407a" (UID: "adfeaccd-4cd7-4038-a189-050439f6407a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.101364 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.101401 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfeaccd-4cd7-4038-a189-050439f6407a-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.101414 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.101427 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.101441 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmdtr\" (UniqueName: \"kubernetes.io/projected/adfeaccd-4cd7-4038-a189-050439f6407a-kube-api-access-jmdtr\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.101452 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfeaccd-4cd7-4038-a189-050439f6407a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.223299 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.245130 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.295163 
4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:59:36 crc kubenswrapper[4790]: E1003 08:59:36.299248 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.299287 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon" Oct 03 08:59:36 crc kubenswrapper[4790]: E1003 08:59:36.299314 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adfeaccd-4cd7-4038-a189-050439f6407a" containerName="cinder-api-log" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.299322 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="adfeaccd-4cd7-4038-a189-050439f6407a" containerName="cinder-api-log" Oct 03 08:59:36 crc kubenswrapper[4790]: E1003 08:59:36.299352 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adfeaccd-4cd7-4038-a189-050439f6407a" containerName="cinder-api" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.299357 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="adfeaccd-4cd7-4038-a189-050439f6407a" containerName="cinder-api" Oct 03 08:59:36 crc kubenswrapper[4790]: E1003 08:59:36.300671 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon-log" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.300743 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon-log" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.301239 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon-log" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.301264 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" containerName="horizon" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.301286 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="adfeaccd-4cd7-4038-a189-050439f6407a" containerName="cinder-api-log" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.301310 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="adfeaccd-4cd7-4038-a189-050439f6407a" containerName="cinder-api" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.306926 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.312395 4790 scope.go:117] "RemoveContainer" containerID="bc33e5db07b735662f20eb2b839ba0dbfaa1141b576d9dfed09994c044e63804" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.312846 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.313126 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.313209 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.353665 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adfeaccd-4cd7-4038-a189-050439f6407a" path="/var/lib/kubelet/pods/adfeaccd-4cd7-4038-a189-050439f6407a/volumes" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.354456 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5" path="/var/lib/kubelet/pods/f57970a5-79c5-4a4f-bdf6-89d25bd8d8c5/volumes" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.355007 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:59:36 crc 
kubenswrapper[4790]: I1003 08:59:36.408273 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7f77060-c6da-48ba-9c55-e454fd1da9fd-logs\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.408329 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-config-data\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.408386 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8hcc\" (UniqueName: \"kubernetes.io/projected/a7f77060-c6da-48ba-9c55-e454fd1da9fd-kube-api-access-g8hcc\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.408480 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-scripts\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.408645 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7f77060-c6da-48ba-9c55-e454fd1da9fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.408681 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.408748 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.408875 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.408943 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.511389 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.511457 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.511517 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7f77060-c6da-48ba-9c55-e454fd1da9fd-logs\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.511557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-config-data\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.511647 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8hcc\" (UniqueName: \"kubernetes.io/projected/a7f77060-c6da-48ba-9c55-e454fd1da9fd-kube-api-access-g8hcc\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.511673 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-scripts\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.511724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7f77060-c6da-48ba-9c55-e454fd1da9fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.511751 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.511797 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.512405 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7f77060-c6da-48ba-9c55-e454fd1da9fd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.512619 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7f77060-c6da-48ba-9c55-e454fd1da9fd-logs\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.516884 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.516947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-config-data-custom\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc 
kubenswrapper[4790]: I1003 08:59:36.517091 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-config-data\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.518836 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-scripts\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.519301 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.522978 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f77060-c6da-48ba-9c55-e454fd1da9fd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.541243 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8hcc\" (UniqueName: \"kubernetes.io/projected/a7f77060-c6da-48ba-9c55-e454fd1da9fd-kube-api-access-g8hcc\") pod \"cinder-api-0\" (UID: \"a7f77060-c6da-48ba-9c55-e454fd1da9fd\") " pod="openstack/cinder-api-0" Oct 03 08:59:36 crc kubenswrapper[4790]: I1003 08:59:36.652071 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.255393 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.978235 4790 generic.go:334] "Generic (PLEG): container finished" podID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerID="f7e1596a2e3a93c9acb16c7a72441fad846a55cd8aad3832a2f6428d26e70811" exitCode=1 Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.978319 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f0afbb0f-f4dc-4814-8c7f-791351972c0b","Type":"ContainerDied","Data":"f7e1596a2e3a93c9acb16c7a72441fad846a55cd8aad3832a2f6428d26e70811"} Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.978350 4790 scope.go:117] "RemoveContainer" containerID="e09e195cd4e8efea622e902164e68ef566c1f2278445f5df4794272fd43c9eb9" Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.981134 4790 scope.go:117] "RemoveContainer" containerID="f7e1596a2e3a93c9acb16c7a72441fad846a55cd8aad3832a2f6428d26e70811" Oct 03 08:59:37 crc kubenswrapper[4790]: E1003 08:59:37.981460 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(f0afbb0f-f4dc-4814-8c7f-791351972c0b)\"" pod="openstack/watcher-decision-engine-0" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.982852 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7f77060-c6da-48ba-9c55-e454fd1da9fd","Type":"ContainerStarted","Data":"102de172eae34679c1d15578c9c5b7528fbb94bc91338c4f690a06b843667f27"} Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.992164 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/swift-proxy-7657d6d659-5lb95"] Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.994404 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.996640 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.996880 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 03 08:59:37 crc kubenswrapper[4790]: I1003 08:59:37.996892 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.003322 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7657d6d659-5lb95"] Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.028468 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.150406 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.162518 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-etc-swift\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.162747 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-run-httpd\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: 
\"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.162841 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-config-data\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.162949 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-internal-tls-certs\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.163068 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st64x\" (UniqueName: \"kubernetes.io/projected/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-kube-api-access-st64x\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.163150 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-log-httpd\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.163270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-public-tls-certs\") 
pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.163400 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-combined-ca-bundle\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.177406 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.265433 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-etc-swift\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.265494 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-config-data\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.265513 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-run-httpd\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.265533 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-internal-tls-certs\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.265611 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st64x\" (UniqueName: \"kubernetes.io/projected/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-kube-api-access-st64x\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.265636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-log-httpd\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.265691 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-public-tls-certs\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.265760 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-combined-ca-bundle\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.272778 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-log-httpd\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.272862 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-run-httpd\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.278784 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-public-tls-certs\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.281822 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-combined-ca-bundle\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.287057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-internal-tls-certs\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.292130 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st64x\" (UniqueName: \"kubernetes.io/projected/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-kube-api-access-st64x\") pod 
\"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.299811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-etc-swift\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.306102 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ee67672-dab9-4d89-b6b4-02eb06e8fe8d-config-data\") pod \"swift-proxy-7657d6d659-5lb95\" (UID: \"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d\") " pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.369029 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.374277 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.451797 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.553529 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4b8db59f-55rqj"] Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.553808 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" podUID="4a6a5c83-e69b-4f25-913b-0ac02229efeb" containerName="dnsmasq-dns" containerID="cri-o://863747eddd27a64a15fc8cab55a9036722aff232d9afb3736fa812d4fa908709" gracePeriod=10 Oct 03 08:59:38 crc kubenswrapper[4790]: I1003 08:59:38.997743 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7f77060-c6da-48ba-9c55-e454fd1da9fd","Type":"ContainerStarted","Data":"60091816637c3a4764a17a076eb5d222d75abeb04dfece8f35063e10775c9510"} Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.015029 4790 generic.go:334] "Generic (PLEG): container finished" podID="4a6a5c83-e69b-4f25-913b-0ac02229efeb" containerID="863747eddd27a64a15fc8cab55a9036722aff232d9afb3736fa812d4fa908709" exitCode=0 Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.015115 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" event={"ID":"4a6a5c83-e69b-4f25-913b-0ac02229efeb","Type":"ContainerDied","Data":"863747eddd27a64a15fc8cab55a9036722aff232d9afb3736fa812d4fa908709"} Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.128610 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.164210 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7657d6d659-5lb95"] Oct 03 08:59:39 crc kubenswrapper[4790]: 
I1003 08:59:39.264114 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.420407 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-svc\") pod \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.420482 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-nb\") pod \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.420539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llrbj\" (UniqueName: \"kubernetes.io/projected/4a6a5c83-e69b-4f25-913b-0ac02229efeb-kube-api-access-llrbj\") pod \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.420686 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-swift-storage-0\") pod \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.420745 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-config\") pod \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.420798 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-sb\") pod \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\" (UID: \"4a6a5c83-e69b-4f25-913b-0ac02229efeb\") " Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.439888 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6a5c83-e69b-4f25-913b-0ac02229efeb-kube-api-access-llrbj" (OuterVolumeSpecName: "kube-api-access-llrbj") pod "4a6a5c83-e69b-4f25-913b-0ac02229efeb" (UID: "4a6a5c83-e69b-4f25-913b-0ac02229efeb"). InnerVolumeSpecName "kube-api-access-llrbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.473094 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4a6a5c83-e69b-4f25-913b-0ac02229efeb" (UID: "4a6a5c83-e69b-4f25-913b-0ac02229efeb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.481467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4a6a5c83-e69b-4f25-913b-0ac02229efeb" (UID: "4a6a5c83-e69b-4f25-913b-0ac02229efeb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.486390 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4a6a5c83-e69b-4f25-913b-0ac02229efeb" (UID: "4a6a5c83-e69b-4f25-913b-0ac02229efeb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.487208 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4a6a5c83-e69b-4f25-913b-0ac02229efeb" (UID: "4a6a5c83-e69b-4f25-913b-0ac02229efeb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.525972 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.526007 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.526016 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.526033 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.526042 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llrbj\" (UniqueName: \"kubernetes.io/projected/4a6a5c83-e69b-4f25-913b-0ac02229efeb-kube-api-access-llrbj\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.537590 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-config" (OuterVolumeSpecName: "config") pod "4a6a5c83-e69b-4f25-913b-0ac02229efeb" (UID: "4a6a5c83-e69b-4f25-913b-0ac02229efeb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.589242 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.589515 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="ceilometer-central-agent" containerID="cri-o://dafeb127ec4a5866622b10537defdc778fdde863c4a580ec5e37ff408cb0f271" gracePeriod=30 Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.589597 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="proxy-httpd" containerID="cri-o://9d20834aa730d9597b074878440f498bf26e6043eac0c87f566906d35d75a87f" gracePeriod=30 Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.589608 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="sg-core" containerID="cri-o://1f6aa929dba32f2388e4c4211571dc2d0ace0f9a415f6df77e04005289adc718" gracePeriod=30 Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.589676 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="ceilometer-notification-agent" containerID="cri-o://728c655da19c3a9ced7deccd24d40a1cc7732051de2fa603ef66cdc99325c734" gracePeriod=30 Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.627826 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4a6a5c83-e69b-4f25-913b-0ac02229efeb-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:39 crc kubenswrapper[4790]: I1003 08:59:39.631430 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.175:3000/\": EOF" Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.042438 4790 generic.go:334] "Generic (PLEG): container finished" podID="c8f86f3a-d523-455f-97b2-6995541d8487" containerID="9d20834aa730d9597b074878440f498bf26e6043eac0c87f566906d35d75a87f" exitCode=0 Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.042768 4790 generic.go:334] "Generic (PLEG): container finished" podID="c8f86f3a-d523-455f-97b2-6995541d8487" containerID="1f6aa929dba32f2388e4c4211571dc2d0ace0f9a415f6df77e04005289adc718" exitCode=2 Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.042529 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8f86f3a-d523-455f-97b2-6995541d8487","Type":"ContainerDied","Data":"9d20834aa730d9597b074878440f498bf26e6043eac0c87f566906d35d75a87f"} Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.042844 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8f86f3a-d523-455f-97b2-6995541d8487","Type":"ContainerDied","Data":"1f6aa929dba32f2388e4c4211571dc2d0ace0f9a415f6df77e04005289adc718"} Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.047092 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" event={"ID":"4a6a5c83-e69b-4f25-913b-0ac02229efeb","Type":"ContainerDied","Data":"59679c0090ee92e8b908a13e20bbaa08dbaf010095c47bf62b3049d83d403a52"} Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.047167 4790 scope.go:117] "RemoveContainer" containerID="863747eddd27a64a15fc8cab55a9036722aff232d9afb3736fa812d4fa908709" Oct 03 
08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.047370 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4b8db59f-55rqj" Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.051377 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7657d6d659-5lb95" event={"ID":"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d","Type":"ContainerStarted","Data":"e3d9752ba95ed49550da768618db17ff65fd492cfe150732a19ed92bc2dc765b"} Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.051421 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7657d6d659-5lb95" event={"ID":"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d","Type":"ContainerStarted","Data":"7b0a555bae0a3931dd96df55abfa3c552a38d8773c290ad751321c2ac694a8d7"} Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.051432 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7657d6d659-5lb95" event={"ID":"9ee67672-dab9-4d89-b6b4-02eb06e8fe8d","Type":"ContainerStarted","Data":"b69fd6a9d3f1d07d7e6fa79e0119c18a9958eeb498086da4277be093701ebc77"} Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.051545 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.056716 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7f77060-c6da-48ba-9c55-e454fd1da9fd","Type":"ContainerStarted","Data":"191ef00e59fab67ec69d8eb293c4a621d29c21e9d2de5a42ac918d3b8d543089"} Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.056889 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" containerName="cinder-scheduler" containerID="cri-o://5c5f8b72278b652f33128ff7085aae88d1656148c0307c10c8c9d6bfefa75a2c" gracePeriod=30 Oct 03 08:59:40 crc kubenswrapper[4790]: 
I1003 08:59:40.058028 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" containerName="probe" containerID="cri-o://763b42a2a4547ba508e9d42da461fa8e7d0cb131165109e046d88f98df39bcb0" gracePeriod=30 Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.101351 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7657d6d659-5lb95" podStartSLOduration=3.101329052 podStartE2EDuration="3.101329052s" podCreationTimestamp="2025-10-03 08:59:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:40.075677739 +0000 UTC m=+1166.428612007" watchObservedRunningTime="2025-10-03 08:59:40.101329052 +0000 UTC m=+1166.454263290" Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.102894 4790 scope.go:117] "RemoveContainer" containerID="2302088d6ec301d64f813bec35e7ce522e314ef668561f6a9c0e45fc24466efd" Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.114769 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.11474915 podStartE2EDuration="4.11474915s" podCreationTimestamp="2025-10-03 08:59:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:40.106550425 +0000 UTC m=+1166.459484653" watchObservedRunningTime="2025-10-03 08:59:40.11474915 +0000 UTC m=+1166.467683388" Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.136891 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4b8db59f-55rqj"] Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.148148 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b4b8db59f-55rqj"] Oct 03 08:59:40 crc kubenswrapper[4790]: I1003 08:59:40.360876 4790 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a6a5c83-e69b-4f25-913b-0ac02229efeb" path="/var/lib/kubelet/pods/4a6a5c83-e69b-4f25-913b-0ac02229efeb/volumes" Oct 03 08:59:41 crc kubenswrapper[4790]: I1003 08:59:41.074288 4790 generic.go:334] "Generic (PLEG): container finished" podID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" containerID="763b42a2a4547ba508e9d42da461fa8e7d0cb131165109e046d88f98df39bcb0" exitCode=0 Oct 03 08:59:41 crc kubenswrapper[4790]: I1003 08:59:41.074608 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dded1c26-62f8-4b21-a7da-f8d4b25bb677","Type":"ContainerDied","Data":"763b42a2a4547ba508e9d42da461fa8e7d0cb131165109e046d88f98df39bcb0"} Oct 03 08:59:41 crc kubenswrapper[4790]: I1003 08:59:41.079410 4790 generic.go:334] "Generic (PLEG): container finished" podID="c8f86f3a-d523-455f-97b2-6995541d8487" containerID="dafeb127ec4a5866622b10537defdc778fdde863c4a580ec5e37ff408cb0f271" exitCode=0 Oct 03 08:59:41 crc kubenswrapper[4790]: I1003 08:59:41.079465 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8f86f3a-d523-455f-97b2-6995541d8487","Type":"ContainerDied","Data":"dafeb127ec4a5866622b10537defdc778fdde863c4a580ec5e37ff408cb0f271"} Oct 03 08:59:41 crc kubenswrapper[4790]: I1003 08:59:41.079522 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:41 crc kubenswrapper[4790]: I1003 08:59:41.080424 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 03 08:59:41 crc kubenswrapper[4790]: I1003 08:59:41.925747 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 08:59:41 crc kubenswrapper[4790]: I1003 08:59:41.926034 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 
08:59:41 crc kubenswrapper[4790]: I1003 08:59:41.927008 4790 scope.go:117] "RemoveContainer" containerID="f7e1596a2e3a93c9acb16c7a72441fad846a55cd8aad3832a2f6428d26e70811" Oct 03 08:59:41 crc kubenswrapper[4790]: E1003 08:59:41.927300 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(f0afbb0f-f4dc-4814-8c7f-791351972c0b)\"" pod="openstack/watcher-decision-engine-0" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" Oct 03 08:59:42 crc kubenswrapper[4790]: I1003 08:59:42.324344 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:42 crc kubenswrapper[4790]: I1003 08:59:42.324643 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0c264d15-baa6-494c-89e1-220296a37ce8" containerName="glance-log" containerID="cri-o://e1f7fd3c3f6cd89b03346ac4f66120fb71f51641e1aa7429240630f77d743d72" gracePeriod=30 Oct 03 08:59:42 crc kubenswrapper[4790]: I1003 08:59:42.325133 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0c264d15-baa6-494c-89e1-220296a37ce8" containerName="glance-httpd" containerID="cri-o://86300d1c7512e83f6372e380c2bce36bc05ced935845c7c701a17989990bda32" gracePeriod=30 Oct 03 08:59:42 crc kubenswrapper[4790]: E1003 08:59:42.616622 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice/crio-166a3036570750727b0dd758229bd28aafecd40eb06b2140fd160ae44b5f7978\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice/crio-b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice\": RecentStats: unable to find data in memory cache]" Oct 03 08:59:43 crc kubenswrapper[4790]: I1003 08:59:43.106791 4790 generic.go:334] "Generic (PLEG): container finished" podID="0c264d15-baa6-494c-89e1-220296a37ce8" containerID="e1f7fd3c3f6cd89b03346ac4f66120fb71f51641e1aa7429240630f77d743d72" exitCode=143 Oct 03 08:59:43 crc kubenswrapper[4790]: I1003 08:59:43.106831 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c264d15-baa6-494c-89e1-220296a37ce8","Type":"ContainerDied","Data":"e1f7fd3c3f6cd89b03346ac4f66120fb71f51641e1aa7429240630f77d743d72"} Oct 03 08:59:43 crc kubenswrapper[4790]: I1003 08:59:43.118487 4790 generic.go:334] "Generic (PLEG): container finished" podID="c8f86f3a-d523-455f-97b2-6995541d8487" containerID="728c655da19c3a9ced7deccd24d40a1cc7732051de2fa603ef66cdc99325c734" exitCode=0 Oct 03 08:59:43 crc kubenswrapper[4790]: I1003 08:59:43.118571 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8f86f3a-d523-455f-97b2-6995541d8487","Type":"ContainerDied","Data":"728c655da19c3a9ced7deccd24d40a1cc7732051de2fa603ef66cdc99325c734"} Oct 03 08:59:43 crc kubenswrapper[4790]: I1003 08:59:43.124306 4790 generic.go:334] "Generic (PLEG): container finished" podID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" containerID="5c5f8b72278b652f33128ff7085aae88d1656148c0307c10c8c9d6bfefa75a2c" exitCode=0 Oct 03 08:59:43 crc kubenswrapper[4790]: I1003 
08:59:43.124365 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dded1c26-62f8-4b21-a7da-f8d4b25bb677","Type":"ContainerDied","Data":"5c5f8b72278b652f33128ff7085aae88d1656148c0307c10c8c9d6bfefa75a2c"} Oct 03 08:59:44 crc kubenswrapper[4790]: I1003 08:59:44.138257 4790 generic.go:334] "Generic (PLEG): container finished" podID="0c264d15-baa6-494c-89e1-220296a37ce8" containerID="86300d1c7512e83f6372e380c2bce36bc05ced935845c7c701a17989990bda32" exitCode=0 Oct 03 08:59:44 crc kubenswrapper[4790]: I1003 08:59:44.138303 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c264d15-baa6-494c-89e1-220296a37ce8","Type":"ContainerDied","Data":"86300d1c7512e83f6372e380c2bce36bc05ced935845c7c701a17989990bda32"} Oct 03 08:59:44 crc kubenswrapper[4790]: I1003 08:59:44.785702 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:44 crc kubenswrapper[4790]: I1003 08:59:44.785947 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ab258ad1-5952-4586-97ac-9da3ad97658d" containerName="glance-log" containerID="cri-o://a00e77f9ee436c7cc8964d611a19e338bf93e927ab08a3cc85c2f19a770a75f4" gracePeriod=30 Oct 03 08:59:44 crc kubenswrapper[4790]: I1003 08:59:44.786389 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ab258ad1-5952-4586-97ac-9da3ad97658d" containerName="glance-httpd" containerID="cri-o://46229c309a222bb8e95f5453935ba4bf91e74c328298d721f09c2610d2e33236" gracePeriod=30 Oct 03 08:59:44 crc kubenswrapper[4790]: I1003 08:59:44.859307 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:45 crc kubenswrapper[4790]: I1003 08:59:45.090287 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/barbican-api-55549c44b-l2c25" Oct 03 08:59:45 crc kubenswrapper[4790]: I1003 08:59:45.156746 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59b887b4f6-vdnlv"] Oct 03 08:59:45 crc kubenswrapper[4790]: I1003 08:59:45.160026 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59b887b4f6-vdnlv" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api-log" containerID="cri-o://cb394e52a9c3ce4a1011d9ebb09501a30c8b49cbbfad1db98cb632f81401d6cf" gracePeriod=30 Oct 03 08:59:45 crc kubenswrapper[4790]: I1003 08:59:45.160377 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59b887b4f6-vdnlv" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api" containerID="cri-o://e84566cac6111f752ce54b2300b992075f833822e1ee7f609aa4d7a9799fe899" gracePeriod=30 Oct 03 08:59:45 crc kubenswrapper[4790]: I1003 08:59:45.175455 4790 generic.go:334] "Generic (PLEG): container finished" podID="ab258ad1-5952-4586-97ac-9da3ad97658d" containerID="a00e77f9ee436c7cc8964d611a19e338bf93e927ab08a3cc85c2f19a770a75f4" exitCode=143 Oct 03 08:59:45 crc kubenswrapper[4790]: I1003 08:59:45.176555 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab258ad1-5952-4586-97ac-9da3ad97658d","Type":"ContainerDied","Data":"a00e77f9ee436c7cc8964d611a19e338bf93e927ab08a3cc85c2f19a770a75f4"} Oct 03 08:59:45 crc kubenswrapper[4790]: I1003 08:59:45.720392 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:46 crc kubenswrapper[4790]: I1003 08:59:46.191030 4790 generic.go:334] "Generic (PLEG): container finished" podID="f334f308-e77d-45ff-8c57-30164f037180" containerID="cb394e52a9c3ce4a1011d9ebb09501a30c8b49cbbfad1db98cb632f81401d6cf" exitCode=143 Oct 03 08:59:46 crc 
kubenswrapper[4790]: I1003 08:59:46.191073 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59b887b4f6-vdnlv" event={"ID":"f334f308-e77d-45ff-8c57-30164f037180","Type":"ContainerDied","Data":"cb394e52a9c3ce4a1011d9ebb09501a30c8b49cbbfad1db98cb632f81401d6cf"} Oct 03 08:59:46 crc kubenswrapper[4790]: I1003 08:59:46.815329 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59b887b4f6-vdnlv" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": read tcp 10.217.0.2:37860->10.217.0.179:9311: read: connection reset by peer" Oct 03 08:59:46 crc kubenswrapper[4790]: I1003 08:59:46.815342 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59b887b4f6-vdnlv" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": read tcp 10.217.0.2:37856->10.217.0.179:9311: read: connection reset by peer" Oct 03 08:59:47 crc kubenswrapper[4790]: I1003 08:59:47.213754 4790 generic.go:334] "Generic (PLEG): container finished" podID="f334f308-e77d-45ff-8c57-30164f037180" containerID="e84566cac6111f752ce54b2300b992075f833822e1ee7f609aa4d7a9799fe899" exitCode=0 Oct 03 08:59:47 crc kubenswrapper[4790]: I1003 08:59:47.213807 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59b887b4f6-vdnlv" event={"ID":"f334f308-e77d-45ff-8c57-30164f037180","Type":"ContainerDied","Data":"e84566cac6111f752ce54b2300b992075f833822e1ee7f609aa4d7a9799fe899"} Oct 03 08:59:47 crc kubenswrapper[4790]: I1003 08:59:47.216048 4790 generic.go:334] "Generic (PLEG): container finished" podID="ab258ad1-5952-4586-97ac-9da3ad97658d" containerID="46229c309a222bb8e95f5453935ba4bf91e74c328298d721f09c2610d2e33236" exitCode=0 Oct 03 08:59:47 crc kubenswrapper[4790]: I1003 08:59:47.216097 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab258ad1-5952-4586-97ac-9da3ad97658d","Type":"ContainerDied","Data":"46229c309a222bb8e95f5453935ba4bf91e74c328298d721f09c2610d2e33236"} Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.134851 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-h5qqd"] Oct 03 08:59:48 crc kubenswrapper[4790]: E1003 08:59:48.135376 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6a5c83-e69b-4f25-913b-0ac02229efeb" containerName="dnsmasq-dns" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.135398 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6a5c83-e69b-4f25-913b-0ac02229efeb" containerName="dnsmasq-dns" Oct 03 08:59:48 crc kubenswrapper[4790]: E1003 08:59:48.135418 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6a5c83-e69b-4f25-913b-0ac02229efeb" containerName="init" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.135424 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6a5c83-e69b-4f25-913b-0ac02229efeb" containerName="init" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.135642 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6a5c83-e69b-4f25-913b-0ac02229efeb" containerName="dnsmasq-dns" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.136349 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-h5qqd" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.147263 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-h5qqd"] Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.230452 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-q2zhq"] Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.231912 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-q2zhq" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.233068 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w46v\" (UniqueName: \"kubernetes.io/projected/28ca9811-81f8-4113-b7c0-ddcb16a66ad9-kube-api-access-2w46v\") pod \"nova-api-db-create-h5qqd\" (UID: \"28ca9811-81f8-4113-b7c0-ddcb16a66ad9\") " pod="openstack/nova-api-db-create-h5qqd" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.238285 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-q2zhq"] Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.309556 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-95d849fd9-jww5w" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.356799 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w46v\" (UniqueName: \"kubernetes.io/projected/28ca9811-81f8-4113-b7c0-ddcb16a66ad9-kube-api-access-2w46v\") pod \"nova-api-db-create-h5qqd\" (UID: \"28ca9811-81f8-4113-b7c0-ddcb16a66ad9\") " pod="openstack/nova-api-db-create-h5qqd" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.356935 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwvfp\" (UniqueName: \"kubernetes.io/projected/0ffe5a55-d80d-47ce-9fc9-0611c97437d1-kube-api-access-rwvfp\") pod \"nova-cell0-db-create-q2zhq\" (UID: \"0ffe5a55-d80d-47ce-9fc9-0611c97437d1\") " pod="openstack/nova-cell0-db-create-q2zhq" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.360336 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-rq7b8"] Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.361768 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rq7b8" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.385133 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rq7b8"] Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.386426 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w46v\" (UniqueName: \"kubernetes.io/projected/28ca9811-81f8-4113-b7c0-ddcb16a66ad9-kube-api-access-2w46v\") pod \"nova-api-db-create-h5qqd\" (UID: \"28ca9811-81f8-4113-b7c0-ddcb16a66ad9\") " pod="openstack/nova-api-db-create-h5qqd" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.394509 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.403352 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7657d6d659-5lb95" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.453140 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8454698868-xzzl2"] Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.453444 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8454698868-xzzl2" podUID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" containerName="neutron-api" containerID="cri-o://aa79f711bd93e6248a4171b28d343a171e80f79979523fb04a3f7639633acf14" gracePeriod=30 Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.453633 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8454698868-xzzl2" podUID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" containerName="neutron-httpd" containerID="cri-o://40fed40024a54795fdebb448fab0ca6cabc5e85aef38ed269bd60974b3067a14" gracePeriod=30 Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.459778 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-f7swv\" (UniqueName: \"kubernetes.io/projected/231c1e2d-3330-48e9-80cb-6f490ba17e37-kube-api-access-f7swv\") pod \"nova-cell1-db-create-rq7b8\" (UID: \"231c1e2d-3330-48e9-80cb-6f490ba17e37\") " pod="openstack/nova-cell1-db-create-rq7b8" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.459955 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwvfp\" (UniqueName: \"kubernetes.io/projected/0ffe5a55-d80d-47ce-9fc9-0611c97437d1-kube-api-access-rwvfp\") pod \"nova-cell0-db-create-q2zhq\" (UID: \"0ffe5a55-d80d-47ce-9fc9-0611c97437d1\") " pod="openstack/nova-cell0-db-create-q2zhq" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.465030 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-h5qqd" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.535249 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwvfp\" (UniqueName: \"kubernetes.io/projected/0ffe5a55-d80d-47ce-9fc9-0611c97437d1-kube-api-access-rwvfp\") pod \"nova-cell0-db-create-q2zhq\" (UID: \"0ffe5a55-d80d-47ce-9fc9-0611c97437d1\") " pod="openstack/nova-cell0-db-create-q2zhq" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.558315 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-q2zhq" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.576069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7swv\" (UniqueName: \"kubernetes.io/projected/231c1e2d-3330-48e9-80cb-6f490ba17e37-kube-api-access-f7swv\") pod \"nova-cell1-db-create-rq7b8\" (UID: \"231c1e2d-3330-48e9-80cb-6f490ba17e37\") " pod="openstack/nova-cell1-db-create-rq7b8" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.635989 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7swv\" (UniqueName: \"kubernetes.io/projected/231c1e2d-3330-48e9-80cb-6f490ba17e37-kube-api-access-f7swv\") pod \"nova-cell1-db-create-rq7b8\" (UID: \"231c1e2d-3330-48e9-80cb-6f490ba17e37\") " pod="openstack/nova-cell1-db-create-rq7b8" Oct 03 08:59:48 crc kubenswrapper[4790]: I1003 08:59:48.784441 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-rq7b8" Oct 03 08:59:49 crc kubenswrapper[4790]: I1003 08:59:49.243362 4790 generic.go:334] "Generic (PLEG): container finished" podID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" containerID="40fed40024a54795fdebb448fab0ca6cabc5e85aef38ed269bd60974b3067a14" exitCode=0 Oct 03 08:59:49 crc kubenswrapper[4790]: I1003 08:59:49.244220 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8454698868-xzzl2" event={"ID":"71091fa8-ba4a-42b7-b738-a6b012c5b65c","Type":"ContainerDied","Data":"40fed40024a54795fdebb448fab0ca6cabc5e85aef38ed269bd60974b3067a14"} Oct 03 08:59:49 crc kubenswrapper[4790]: I1003 08:59:49.554054 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 03 08:59:49 crc kubenswrapper[4790]: I1003 08:59:49.805880 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="proxy-httpd" 
probeResult="failure" output="Get \"http://10.217.0.175:3000/\": dial tcp 10.217.0.175:3000: connect: connection refused" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.534557 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.625362 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data-custom\") pod \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.625603 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dded1c26-62f8-4b21-a7da-f8d4b25bb677-etc-machine-id\") pod \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.625684 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-combined-ca-bundle\") pod \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.625713 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data\") pod \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.625846 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-scripts\") pod 
\"dded1c26-62f8-4b21-a7da-f8d4b25bb677\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.625898 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m45rv\" (UniqueName: \"kubernetes.io/projected/dded1c26-62f8-4b21-a7da-f8d4b25bb677-kube-api-access-m45rv\") pod \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.638028 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dded1c26-62f8-4b21-a7da-f8d4b25bb677-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dded1c26-62f8-4b21-a7da-f8d4b25bb677" (UID: "dded1c26-62f8-4b21-a7da-f8d4b25bb677"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.638090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dded1c26-62f8-4b21-a7da-f8d4b25bb677-kube-api-access-m45rv" (OuterVolumeSpecName: "kube-api-access-m45rv") pod "dded1c26-62f8-4b21-a7da-f8d4b25bb677" (UID: "dded1c26-62f8-4b21-a7da-f8d4b25bb677"). InnerVolumeSpecName "kube-api-access-m45rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.644690 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dded1c26-62f8-4b21-a7da-f8d4b25bb677" (UID: "dded1c26-62f8-4b21-a7da-f8d4b25bb677"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.650103 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-scripts" (OuterVolumeSpecName: "scripts") pod "dded1c26-62f8-4b21-a7da-f8d4b25bb677" (UID: "dded1c26-62f8-4b21-a7da-f8d4b25bb677"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.728818 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.728852 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m45rv\" (UniqueName: \"kubernetes.io/projected/dded1c26-62f8-4b21-a7da-f8d4b25bb677-kube-api-access-m45rv\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.728865 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.728873 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dded1c26-62f8-4b21-a7da-f8d4b25bb677-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.774986 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dded1c26-62f8-4b21-a7da-f8d4b25bb677" (UID: "dded1c26-62f8-4b21-a7da-f8d4b25bb677"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.829995 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data" (OuterVolumeSpecName: "config-data") pod "dded1c26-62f8-4b21-a7da-f8d4b25bb677" (UID: "dded1c26-62f8-4b21-a7da-f8d4b25bb677"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.830076 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data\") pod \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\" (UID: \"dded1c26-62f8-4b21-a7da-f8d4b25bb677\") " Oct 03 08:59:50 crc kubenswrapper[4790]: W1003 08:59:50.830185 4790 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/dded1c26-62f8-4b21-a7da-f8d4b25bb677/volumes/kubernetes.io~secret/config-data Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.830201 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data" (OuterVolumeSpecName: "config-data") pod "dded1c26-62f8-4b21-a7da-f8d4b25bb677" (UID: "dded1c26-62f8-4b21-a7da-f8d4b25bb677"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.830460 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.830471 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dded1c26-62f8-4b21-a7da-f8d4b25bb677-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:50 crc kubenswrapper[4790]: I1003 08:59:50.960225 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.033932 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-sg-core-conf-yaml\") pod \"c8f86f3a-d523-455f-97b2-6995541d8487\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.034009 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-combined-ca-bundle\") pod \"c8f86f3a-d523-455f-97b2-6995541d8487\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.034045 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-run-httpd\") pod \"c8f86f3a-d523-455f-97b2-6995541d8487\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.034124 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqp6m\" 
(UniqueName: \"kubernetes.io/projected/c8f86f3a-d523-455f-97b2-6995541d8487-kube-api-access-bqp6m\") pod \"c8f86f3a-d523-455f-97b2-6995541d8487\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.034284 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-config-data\") pod \"c8f86f3a-d523-455f-97b2-6995541d8487\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.034331 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-scripts\") pod \"c8f86f3a-d523-455f-97b2-6995541d8487\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.034369 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-log-httpd\") pod \"c8f86f3a-d523-455f-97b2-6995541d8487\" (UID: \"c8f86f3a-d523-455f-97b2-6995541d8487\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.040559 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8f86f3a-d523-455f-97b2-6995541d8487" (UID: "c8f86f3a-d523-455f-97b2-6995541d8487"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.040784 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8f86f3a-d523-455f-97b2-6995541d8487" (UID: "c8f86f3a-d523-455f-97b2-6995541d8487"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.043118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8f86f3a-d523-455f-97b2-6995541d8487-kube-api-access-bqp6m" (OuterVolumeSpecName: "kube-api-access-bqp6m") pod "c8f86f3a-d523-455f-97b2-6995541d8487" (UID: "c8f86f3a-d523-455f-97b2-6995541d8487"). InnerVolumeSpecName "kube-api-access-bqp6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.045732 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-scripts" (OuterVolumeSpecName: "scripts") pod "c8f86f3a-d523-455f-97b2-6995541d8487" (UID: "c8f86f3a-d523-455f-97b2-6995541d8487"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.083122 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8f86f3a-d523-455f-97b2-6995541d8487" (UID: "c8f86f3a-d523-455f-97b2-6995541d8487"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.154065 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.154101 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.154111 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.154122 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8f86f3a-d523-455f-97b2-6995541d8487-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.154131 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqp6m\" (UniqueName: \"kubernetes.io/projected/c8f86f3a-d523-455f-97b2-6995541d8487-kube-api-access-bqp6m\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.185201 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8f86f3a-d523-455f-97b2-6995541d8487" (UID: "c8f86f3a-d523-455f-97b2-6995541d8487"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.214307 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-config-data" (OuterVolumeSpecName: "config-data") pod "c8f86f3a-d523-455f-97b2-6995541d8487" (UID: "c8f86f3a-d523-455f-97b2-6995541d8487"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.258692 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.258724 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8f86f3a-d523-455f-97b2-6995541d8487-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.295904 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.297356 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.300612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8f86f3a-d523-455f-97b2-6995541d8487","Type":"ContainerDied","Data":"77c7c8470e544e061284624df7cc25f4049550cbd17eef5db9144e2c3b322e26"} Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.300648 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.300671 4790 scope.go:117] "RemoveContainer" containerID="9d20834aa730d9597b074878440f498bf26e6043eac0c87f566906d35d75a87f" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.313679 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.313902 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dded1c26-62f8-4b21-a7da-f8d4b25bb677","Type":"ContainerDied","Data":"cf596d6400578bbacd511876623c828570a7eacb8dd11836cf9f5494bed89778"} Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.331293 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59b887b4f6-vdnlv" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.331289 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59b887b4f6-vdnlv" event={"ID":"f334f308-e77d-45ff-8c57-30164f037180","Type":"ContainerDied","Data":"19ecd839288fc39ac586a2c0669071d05e5e64d95a63012ca15a166a1e3fc220"} Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.353531 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.353660 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c264d15-baa6-494c-89e1-220296a37ce8","Type":"ContainerDied","Data":"75e14685cbdc766dac9c5631ebf06f24403d08c618e0cee83389ef736be2b117"} Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359459 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7a835cde-9c23-4cf9-9473-35d01cf2c528","Type":"ContainerStarted","Data":"c5a2fe9826e8b01395dfbdbcb43030f95e6f51c7bef75503216fe1613587a634"} Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359553 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data-custom\") pod \"f334f308-e77d-45ff-8c57-30164f037180\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-config-data\") pod \"0c264d15-baa6-494c-89e1-220296a37ce8\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359672 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-logs\") pod \"0c264d15-baa6-494c-89e1-220296a37ce8\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359702 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntch7\" (UniqueName: \"kubernetes.io/projected/f334f308-e77d-45ff-8c57-30164f037180-kube-api-access-ntch7\") 
pod \"f334f308-e77d-45ff-8c57-30164f037180\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359753 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-scripts\") pod \"0c264d15-baa6-494c-89e1-220296a37ce8\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359793 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-httpd-run\") pod \"0c264d15-baa6-494c-89e1-220296a37ce8\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359869 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-public-tls-certs\") pod \"0c264d15-baa6-494c-89e1-220296a37ce8\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359887 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f334f308-e77d-45ff-8c57-30164f037180-logs\") pod \"f334f308-e77d-45ff-8c57-30164f037180\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359904 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data\") pod \"f334f308-e77d-45ff-8c57-30164f037180\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359945 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-combined-ca-bundle\") pod \"f334f308-e77d-45ff-8c57-30164f037180\" (UID: \"f334f308-e77d-45ff-8c57-30164f037180\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359981 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtd2n\" (UniqueName: \"kubernetes.io/projected/0c264d15-baa6-494c-89e1-220296a37ce8-kube-api-access-jtd2n\") pod \"0c264d15-baa6-494c-89e1-220296a37ce8\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.359998 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"0c264d15-baa6-494c-89e1-220296a37ce8\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.360054 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-combined-ca-bundle\") pod \"0c264d15-baa6-494c-89e1-220296a37ce8\" (UID: \"0c264d15-baa6-494c-89e1-220296a37ce8\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.360783 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0c264d15-baa6-494c-89e1-220296a37ce8" (UID: "0c264d15-baa6-494c-89e1-220296a37ce8"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.362204 4790 scope.go:117] "RemoveContainer" containerID="1f6aa929dba32f2388e4c4211571dc2d0ace0f9a415f6df77e04005289adc718" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.362462 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-logs" (OuterVolumeSpecName: "logs") pod "0c264d15-baa6-494c-89e1-220296a37ce8" (UID: "0c264d15-baa6-494c-89e1-220296a37ce8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.363244 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f334f308-e77d-45ff-8c57-30164f037180-logs" (OuterVolumeSpecName: "logs") pod "f334f308-e77d-45ff-8c57-30164f037180" (UID: "f334f308-e77d-45ff-8c57-30164f037180"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.400422 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.424774 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f334f308-e77d-45ff-8c57-30164f037180" (UID: "f334f308-e77d-45ff-8c57-30164f037180"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.424870 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "0c264d15-baa6-494c-89e1-220296a37ce8" (UID: "0c264d15-baa6-494c-89e1-220296a37ce8"). 
InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.425910 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-scripts" (OuterVolumeSpecName: "scripts") pod "0c264d15-baa6-494c-89e1-220296a37ce8" (UID: "0c264d15-baa6-494c-89e1-220296a37ce8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.435619 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c264d15-baa6-494c-89e1-220296a37ce8-kube-api-access-jtd2n" (OuterVolumeSpecName: "kube-api-access-jtd2n") pod "0c264d15-baa6-494c-89e1-220296a37ce8" (UID: "0c264d15-baa6-494c-89e1-220296a37ce8"). InnerVolumeSpecName "kube-api-access-jtd2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.453072 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f334f308-e77d-45ff-8c57-30164f037180-kube-api-access-ntch7" (OuterVolumeSpecName: "kube-api-access-ntch7") pod "f334f308-e77d-45ff-8c57-30164f037180" (UID: "f334f308-e77d-45ff-8c57-30164f037180"). InnerVolumeSpecName "kube-api-access-ntch7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.455417 4790 scope.go:117] "RemoveContainer" containerID="728c655da19c3a9ced7deccd24d40a1cc7732051de2fa603ef66cdc99325c734" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.460697 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f334f308-e77d-45ff-8c57-30164f037180" (UID: "f334f308-e77d-45ff-8c57-30164f037180"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.461906 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.461926 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.461935 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntch7\" (UniqueName: \"kubernetes.io/projected/f334f308-e77d-45ff-8c57-30164f037180-kube-api-access-ntch7\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.461945 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.461953 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c264d15-baa6-494c-89e1-220296a37ce8-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.461961 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f334f308-e77d-45ff-8c57-30164f037180-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.461968 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.461977 4790 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtd2n\" (UniqueName: \"kubernetes.io/projected/0c264d15-baa6-494c-89e1-220296a37ce8-kube-api-access-jtd2n\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.461993 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.499735 4790 scope.go:117] "RemoveContainer" containerID="dafeb127ec4a5866622b10537defdc778fdde863c4a580ec5e37ff408cb0f271" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.507645 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523373 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.523794 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="sg-core" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523808 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="sg-core" Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.523827 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c264d15-baa6-494c-89e1-220296a37ce8" containerName="glance-httpd" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523834 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c264d15-baa6-494c-89e1-220296a37ce8" containerName="glance-httpd" Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.523846 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api-log" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523852 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api-log" Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.523867 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" containerName="cinder-scheduler" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523873 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" containerName="cinder-scheduler" Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.523890 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c264d15-baa6-494c-89e1-220296a37ce8" containerName="glance-log" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523896 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c264d15-baa6-494c-89e1-220296a37ce8" containerName="glance-log" Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.523904 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="proxy-httpd" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523909 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="proxy-httpd" Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.523921 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="ceilometer-central-agent" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523926 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="ceilometer-central-agent" Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.523938 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="ceilometer-notification-agent" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523944 
4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="ceilometer-notification-agent" Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.523952 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" containerName="probe" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523958 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" containerName="probe" Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.523968 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.523973 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.524158 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c264d15-baa6-494c-89e1-220296a37ce8" containerName="glance-httpd" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.524170 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c264d15-baa6-494c-89e1-220296a37ce8" containerName="glance-log" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.524182 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.524191 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" containerName="probe" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.524199 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="ceilometer-notification-agent" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.524210 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" containerName="cinder-scheduler" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.524217 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="ceilometer-central-agent" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.524227 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="sg-core" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.524235 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" containerName="proxy-httpd" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.524245 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api-log" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.526074 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.532134 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.543020 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.543460 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.544494 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.547301 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c264d15-baa6-494c-89e1-220296a37ce8" (UID: "0c264d15-baa6-494c-89e1-220296a37ce8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.552698 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0c264d15-baa6-494c-89e1-220296a37ce8" (UID: "0c264d15-baa6-494c-89e1-220296a37ce8"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.559364 4790 scope.go:117] "RemoveContainer" containerID="763b42a2a4547ba508e9d42da461fa8e7d0cb131165109e046d88f98df39bcb0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.566521 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-scripts\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.566610 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smttp\" (UniqueName: \"kubernetes.io/projected/b240084f-7a9d-4dca-89a2-097e9904c9d8-kube-api-access-smttp\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.566644 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-log-httpd\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.566697 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-run-httpd\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.566729 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-config-data\") pod 
\"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.566795 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.566835 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.566907 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.566929 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.566942 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.570337 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-h5qqd"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.571285 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" 
podStartSLOduration=2.916891228 podStartE2EDuration="21.571266225s" podCreationTimestamp="2025-10-03 08:59:30 +0000 UTC" firstStartedPulling="2025-10-03 08:59:31.812364527 +0000 UTC m=+1158.165298775" lastFinishedPulling="2025-10-03 08:59:50.466739534 +0000 UTC m=+1176.819673772" observedRunningTime="2025-10-03 08:59:51.453294212 +0000 UTC m=+1177.806228460" watchObservedRunningTime="2025-10-03 08:59:51.571266225 +0000 UTC m=+1177.924200463" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.584197 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data" (OuterVolumeSpecName: "config-data") pod "f334f308-e77d-45ff-8c57-30164f037180" (UID: "f334f308-e77d-45ff-8c57-30164f037180"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.584419 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-config-data" (OuterVolumeSpecName: "config-data") pod "0c264d15-baa6-494c-89e1-220296a37ce8" (UID: "0c264d15-baa6-494c-89e1-220296a37ce8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.628535 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-q2zhq"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.635673 4790 scope.go:117] "RemoveContainer" containerID="5c5f8b72278b652f33128ff7085aae88d1656148c0307c10c8c9d6bfefa75a2c" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.641677 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.645394 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.664165 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.670375 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smttp\" (UniqueName: \"kubernetes.io/projected/b240084f-7a9d-4dca-89a2-097e9904c9d8-kube-api-access-smttp\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.670407 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-log-httpd\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.670454 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-run-httpd\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.670477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-config-data\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.670538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" 
Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.670564 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.670639 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-scripts\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.670687 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f334f308-e77d-45ff-8c57-30164f037180-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.670698 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c264d15-baa6-494c-89e1-220296a37ce8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.671311 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-run-httpd\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.671801 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-log-httpd\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.680050 4790 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.680445 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab258ad1-5952-4586-97ac-9da3ad97658d" containerName="glance-log" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.680458 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab258ad1-5952-4586-97ac-9da3ad97658d" containerName="glance-log" Oct 03 08:59:51 crc kubenswrapper[4790]: E1003 08:59:51.680466 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab258ad1-5952-4586-97ac-9da3ad97658d" containerName="glance-httpd" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.680472 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab258ad1-5952-4586-97ac-9da3ad97658d" containerName="glance-httpd" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.680736 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab258ad1-5952-4586-97ac-9da3ad97658d" containerName="glance-log" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.680748 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab258ad1-5952-4586-97ac-9da3ad97658d" containerName="glance-httpd" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.681728 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.684874 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-scripts\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.685278 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.690863 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.692058 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-config-data\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.701609 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.703135 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.715713 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smttp\" (UniqueName: 
\"kubernetes.io/projected/b240084f-7a9d-4dca-89a2-097e9904c9d8-kube-api-access-smttp\") pod \"ceilometer-0\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.731036 4790 scope.go:117] "RemoveContainer" containerID="e84566cac6111f752ce54b2300b992075f833822e1ee7f609aa4d7a9799fe899" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.740021 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-rq7b8"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.772426 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-combined-ca-bundle\") pod \"ab258ad1-5952-4586-97ac-9da3ad97658d\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.772825 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-config-data\") pod \"ab258ad1-5952-4586-97ac-9da3ad97658d\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.773018 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ab258ad1-5952-4586-97ac-9da3ad97658d\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.773262 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4gfc\" (UniqueName: \"kubernetes.io/projected/ab258ad1-5952-4586-97ac-9da3ad97658d-kube-api-access-k4gfc\") pod \"ab258ad1-5952-4586-97ac-9da3ad97658d\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.773547 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-scripts\") pod \"ab258ad1-5952-4586-97ac-9da3ad97658d\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.773681 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-httpd-run\") pod \"ab258ad1-5952-4586-97ac-9da3ad97658d\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.773831 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-internal-tls-certs\") pod \"ab258ad1-5952-4586-97ac-9da3ad97658d\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.773952 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-logs\") pod \"ab258ad1-5952-4586-97ac-9da3ad97658d\" (UID: \"ab258ad1-5952-4586-97ac-9da3ad97658d\") " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.774481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-config-data\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.774826 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jj5g\" (UniqueName: \"kubernetes.io/projected/1289ded7-2702-43dc-9d28-00567ba7b668-kube-api-access-2jj5g\") pod 
\"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.775069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1289ded7-2702-43dc-9d28-00567ba7b668-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.775229 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-scripts\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.775388 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.775671 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.776860 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ab258ad1-5952-4586-97ac-9da3ad97658d" (UID: "ab258ad1-5952-4586-97ac-9da3ad97658d"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.780489 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-logs" (OuterVolumeSpecName: "logs") pod "ab258ad1-5952-4586-97ac-9da3ad97658d" (UID: "ab258ad1-5952-4586-97ac-9da3ad97658d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.820797 4790 scope.go:117] "RemoveContainer" containerID="cb394e52a9c3ce4a1011d9ebb09501a30c8b49cbbfad1db98cb632f81401d6cf" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.825076 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59b887b4f6-vdnlv"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.825241 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "ab258ad1-5952-4586-97ac-9da3ad97658d" (UID: "ab258ad1-5952-4586-97ac-9da3ad97658d"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.825452 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-scripts" (OuterVolumeSpecName: "scripts") pod "ab258ad1-5952-4586-97ac-9da3ad97658d" (UID: "ab258ad1-5952-4586-97ac-9da3ad97658d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.826892 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab258ad1-5952-4586-97ac-9da3ad97658d-kube-api-access-k4gfc" (OuterVolumeSpecName: "kube-api-access-k4gfc") pod "ab258ad1-5952-4586-97ac-9da3ad97658d" (UID: "ab258ad1-5952-4586-97ac-9da3ad97658d"). InnerVolumeSpecName "kube-api-access-k4gfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.859506 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-59b887b4f6-vdnlv"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.864536 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.865507 4790 scope.go:117] "RemoveContainer" containerID="86300d1c7512e83f6372e380c2bce36bc05ced935845c7c701a17989990bda32" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1289ded7-2702-43dc-9d28-00567ba7b668-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878112 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-scripts\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878160 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878232 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878345 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-config-data\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878376 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jj5g\" (UniqueName: \"kubernetes.io/projected/1289ded7-2702-43dc-9d28-00567ba7b668-kube-api-access-2jj5g\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878467 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878481 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878496 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ab258ad1-5952-4586-97ac-9da3ad97658d-logs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878521 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.878534 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4gfc\" (UniqueName: \"kubernetes.io/projected/ab258ad1-5952-4586-97ac-9da3ad97658d-kube-api-access-k4gfc\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.880289 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1289ded7-2702-43dc-9d28-00567ba7b668-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.883850 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.884651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.885358 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-config-data\") pod \"cinder-scheduler-0\" 
(UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.885673 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1289ded7-2702-43dc-9d28-00567ba7b668-scripts\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.889403 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.899781 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.907234 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab258ad1-5952-4586-97ac-9da3ad97658d" (UID: "ab258ad1-5952-4586-97ac-9da3ad97658d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.907835 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.908773 4790 scope.go:117] "RemoveContainer" containerID="e1f7fd3c3f6cd89b03346ac4f66120fb71f51641e1aa7429240630f77d743d72" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.909505 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.912678 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.913289 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.920063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jj5g\" (UniqueName: \"kubernetes.io/projected/1289ded7-2702-43dc-9d28-00567ba7b668-kube-api-access-2jj5g\") pod \"cinder-scheduler-0\" (UID: \"1289ded7-2702-43dc-9d28-00567ba7b668\") " pod="openstack/cinder-scheduler-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.921117 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.958336 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.979783 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.979850 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scgbm\" (UniqueName: \"kubernetes.io/projected/42c4562a-910e-47f6-befc-effb8f851374-kube-api-access-scgbm\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " 
pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.979871 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.979891 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42c4562a-910e-47f6-befc-effb8f851374-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.979926 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.979942 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c4562a-910e-47f6-befc-effb8f851374-logs\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.979955 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-scripts\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " 
pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.980003 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-config-data\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.980093 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:51 crc kubenswrapper[4790]: I1003 08:59:51.980104 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.051506 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ab258ad1-5952-4586-97ac-9da3ad97658d" (UID: "ab258ad1-5952-4586-97ac-9da3ad97658d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.093590 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.093695 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scgbm\" (UniqueName: \"kubernetes.io/projected/42c4562a-910e-47f6-befc-effb8f851374-kube-api-access-scgbm\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.093727 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.093753 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42c4562a-910e-47f6-befc-effb8f851374-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.093792 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 
03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.093822 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c4562a-910e-47f6-befc-effb8f851374-logs\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.093843 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-scripts\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.093931 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-config-data\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.094039 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.095279 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42c4562a-910e-47f6-befc-effb8f851374-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.098098 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c4562a-910e-47f6-befc-effb8f851374-logs\") 
pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.098400 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.104716 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.112201 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-scripts\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.112762 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.114389 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.127425 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/42c4562a-910e-47f6-befc-effb8f851374-config-data\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.130797 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scgbm\" (UniqueName: \"kubernetes.io/projected/42c4562a-910e-47f6-befc-effb8f851374-kube-api-access-scgbm\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.215005 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"42c4562a-910e-47f6-befc-effb8f851374\") " pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.216297 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-config-data" (OuterVolumeSpecName: "config-data") pod "ab258ad1-5952-4586-97ac-9da3ad97658d" (UID: "ab258ad1-5952-4586-97ac-9da3ad97658d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.245188 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.300468 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab258ad1-5952-4586-97ac-9da3ad97658d-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.378896 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c264d15-baa6-494c-89e1-220296a37ce8" path="/var/lib/kubelet/pods/0c264d15-baa6-494c-89e1-220296a37ce8/volumes" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.380648 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8f86f3a-d523-455f-97b2-6995541d8487" path="/var/lib/kubelet/pods/c8f86f3a-d523-455f-97b2-6995541d8487/volumes" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.381506 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dded1c26-62f8-4b21-a7da-f8d4b25bb677" path="/var/lib/kubelet/pods/dded1c26-62f8-4b21-a7da-f8d4b25bb677/volumes" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.384177 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f334f308-e77d-45ff-8c57-30164f037180" path="/var/lib/kubelet/pods/f334f308-e77d-45ff-8c57-30164f037180/volumes" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.409848 4790 generic.go:334] "Generic (PLEG): container finished" podID="0ffe5a55-d80d-47ce-9fc9-0611c97437d1" containerID="bc56f9848047534b21e8193971dec96ae80429f4424af6a673934098585c9f14" exitCode=0 Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.409959 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-q2zhq" event={"ID":"0ffe5a55-d80d-47ce-9fc9-0611c97437d1","Type":"ContainerDied","Data":"bc56f9848047534b21e8193971dec96ae80429f4424af6a673934098585c9f14"} Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.409986 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell0-db-create-q2zhq" event={"ID":"0ffe5a55-d80d-47ce-9fc9-0611c97437d1","Type":"ContainerStarted","Data":"fae3dfcd0544f4a5106f3020fb2ab504905ad93bcccf0490cef1d850d24d1b34"} Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.433971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rq7b8" event={"ID":"231c1e2d-3330-48e9-80cb-6f490ba17e37","Type":"ContainerStarted","Data":"cc38a3e97666b4fe691265c985fd77f84b5a2e2fd71eb66b094e567d30914519"} Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.434020 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rq7b8" event={"ID":"231c1e2d-3330-48e9-80cb-6f490ba17e37","Type":"ContainerStarted","Data":"f9cc868dc4cf11bfd232d619cd4d949f4b0e4153fc81f21aa29ecb67df2ad976"} Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.455754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ab258ad1-5952-4586-97ac-9da3ad97658d","Type":"ContainerDied","Data":"8fc3021598e228d28ab9357026f9f22e644b7b0d478b260227e6425b23115872"} Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.455986 4790 scope.go:117] "RemoveContainer" containerID="46229c309a222bb8e95f5453935ba4bf91e74c328298d721f09c2610d2e33236" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.456141 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.460846 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-rq7b8" podStartSLOduration=4.460822353 podStartE2EDuration="4.460822353s" podCreationTimestamp="2025-10-03 08:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:52.453908844 +0000 UTC m=+1178.806843082" watchObservedRunningTime="2025-10-03 08:59:52.460822353 +0000 UTC m=+1178.813756591" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.507704 4790 generic.go:334] "Generic (PLEG): container finished" podID="28ca9811-81f8-4113-b7c0-ddcb16a66ad9" containerID="a7dff9a615b4a115915a822588c7e829a583ee6905544f42265c30cf055594e4" exitCode=0 Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.509504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-h5qqd" event={"ID":"28ca9811-81f8-4113-b7c0-ddcb16a66ad9","Type":"ContainerDied","Data":"a7dff9a615b4a115915a822588c7e829a583ee6905544f42265c30cf055594e4"} Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.509540 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-h5qqd" event={"ID":"28ca9811-81f8-4113-b7c0-ddcb16a66ad9","Type":"ContainerStarted","Data":"019d33df680131ca2fa34954463d7e5611e8d060bb23c5fb2ff8337e7786e5e4"} Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.533266 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.558976 4790 scope.go:117] "RemoveContainer" containerID="a00e77f9ee436c7cc8964d611a19e338bf93e927ab08a3cc85c2f19a770a75f4" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.564971 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.580641 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.645642 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.656475 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.674402 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.674929 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.675116 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.745196 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.745282 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.745309 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.745354 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.745406 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmpgd\" (UniqueName: \"kubernetes.io/projected/d1558347-4d48-4e51-a89c-a3d8664016a4-kube-api-access-fmpgd\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.745423 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1558347-4d48-4e51-a89c-a3d8664016a4-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.745446 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1558347-4d48-4e51-a89c-a3d8664016a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.745462 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.839040 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.848385 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.848437 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.848491 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.848546 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmpgd\" (UniqueName: \"kubernetes.io/projected/d1558347-4d48-4e51-a89c-a3d8664016a4-kube-api-access-fmpgd\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " 
pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.848565 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1558347-4d48-4e51-a89c-a3d8664016a4-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.848603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1558347-4d48-4e51-a89c-a3d8664016a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.848622 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.848654 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.851166 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 
crc kubenswrapper[4790]: I1003 08:59:52.851471 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1558347-4d48-4e51-a89c-a3d8664016a4-logs\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.852306 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1558347-4d48-4e51-a89c-a3d8664016a4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.852310 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.857016 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.867438 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.867853 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d1558347-4d48-4e51-a89c-a3d8664016a4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.871269 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmpgd\" (UniqueName: \"kubernetes.io/projected/d1558347-4d48-4e51-a89c-a3d8664016a4-kube-api-access-fmpgd\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:52 crc kubenswrapper[4790]: I1003 08:59:52.952108 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d1558347-4d48-4e51-a89c-a3d8664016a4\") " pod="openstack/glance-default-internal-api-0" Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.072430 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 03 08:59:53 crc kubenswrapper[4790]: E1003 08:59:53.196133 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice/crio-b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice/crio-166a3036570750727b0dd758229bd28aafecd40eb06b2140fd160ae44b5f7978\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice\": RecentStats: unable to find data in memory cache]" Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.224719 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.548276 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b240084f-7a9d-4dca-89a2-097e9904c9d8","Type":"ContainerStarted","Data":"cd1266527775f4ca99436e80c5cbf44aeccc122b210d128117301265eb46d58e"} Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.548646 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b240084f-7a9d-4dca-89a2-097e9904c9d8","Type":"ContainerStarted","Data":"f0cb2a0bc89ba7208436edb795d32f60f7db32722c1dd8ae4d49479e79cab334"} Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.557450 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"1289ded7-2702-43dc-9d28-00567ba7b668","Type":"ContainerStarted","Data":"d832348fc2c09e05da07e840c59a55fcac9c167d6241125bfa3e8161838c510e"} Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.559603 4790 generic.go:334] "Generic (PLEG): container finished" podID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" containerID="aa79f711bd93e6248a4171b28d343a171e80f79979523fb04a3f7639633acf14" exitCode=0 Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.559666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8454698868-xzzl2" event={"ID":"71091fa8-ba4a-42b7-b738-a6b012c5b65c","Type":"ContainerDied","Data":"aa79f711bd93e6248a4171b28d343a171e80f79979523fb04a3f7639633acf14"} Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.564339 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42c4562a-910e-47f6-befc-effb8f851374","Type":"ContainerStarted","Data":"e5b82a6d0fd58c076cdef22de2f4e27b108b2dbc2787aacc47e3679bbc98fa98"} Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.574984 4790 generic.go:334] "Generic (PLEG): container finished" podID="231c1e2d-3330-48e9-80cb-6f490ba17e37" containerID="cc38a3e97666b4fe691265c985fd77f84b5a2e2fd71eb66b094e567d30914519" exitCode=0 Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.575709 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rq7b8" event={"ID":"231c1e2d-3330-48e9-80cb-6f490ba17e37","Type":"ContainerDied","Data":"cc38a3e97666b4fe691265c985fd77f84b5a2e2fd71eb66b094e567d30914519"} Oct 03 08:59:53 crc kubenswrapper[4790]: I1003 08:59:53.815617 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.380034 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab258ad1-5952-4586-97ac-9da3ad97658d" 
path="/var/lib/kubelet/pods/ab258ad1-5952-4586-97ac-9da3ad97658d/volumes" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.397228 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-q2zhq" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.421222 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-h5qqd" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.433934 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.490892 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-httpd-config\") pod \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.490969 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-config\") pod \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.491110 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w46v\" (UniqueName: \"kubernetes.io/projected/28ca9811-81f8-4113-b7c0-ddcb16a66ad9-kube-api-access-2w46v\") pod \"28ca9811-81f8-4113-b7c0-ddcb16a66ad9\" (UID: \"28ca9811-81f8-4113-b7c0-ddcb16a66ad9\") " Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.491151 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwvfp\" (UniqueName: \"kubernetes.io/projected/0ffe5a55-d80d-47ce-9fc9-0611c97437d1-kube-api-access-rwvfp\") pod 
\"0ffe5a55-d80d-47ce-9fc9-0611c97437d1\" (UID: \"0ffe5a55-d80d-47ce-9fc9-0611c97437d1\") " Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.491199 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-combined-ca-bundle\") pod \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.491257 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-ovndb-tls-certs\") pod \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.491283 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qbtw\" (UniqueName: \"kubernetes.io/projected/71091fa8-ba4a-42b7-b738-a6b012c5b65c-kube-api-access-9qbtw\") pod \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\" (UID: \"71091fa8-ba4a-42b7-b738-a6b012c5b65c\") " Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.496811 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "71091fa8-ba4a-42b7-b738-a6b012c5b65c" (UID: "71091fa8-ba4a-42b7-b738-a6b012c5b65c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.502068 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71091fa8-ba4a-42b7-b738-a6b012c5b65c-kube-api-access-9qbtw" (OuterVolumeSpecName: "kube-api-access-9qbtw") pod "71091fa8-ba4a-42b7-b738-a6b012c5b65c" (UID: "71091fa8-ba4a-42b7-b738-a6b012c5b65c"). 
InnerVolumeSpecName "kube-api-access-9qbtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.521012 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ffe5a55-d80d-47ce-9fc9-0611c97437d1-kube-api-access-rwvfp" (OuterVolumeSpecName: "kube-api-access-rwvfp") pod "0ffe5a55-d80d-47ce-9fc9-0611c97437d1" (UID: "0ffe5a55-d80d-47ce-9fc9-0611c97437d1"). InnerVolumeSpecName "kube-api-access-rwvfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.568144 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ca9811-81f8-4113-b7c0-ddcb16a66ad9-kube-api-access-2w46v" (OuterVolumeSpecName: "kube-api-access-2w46v") pod "28ca9811-81f8-4113-b7c0-ddcb16a66ad9" (UID: "28ca9811-81f8-4113-b7c0-ddcb16a66ad9"). InnerVolumeSpecName "kube-api-access-2w46v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.573831 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-config" (OuterVolumeSpecName: "config") pod "71091fa8-ba4a-42b7-b738-a6b012c5b65c" (UID: "71091fa8-ba4a-42b7-b738-a6b012c5b65c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.602779 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.603041 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-config\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.603057 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w46v\" (UniqueName: \"kubernetes.io/projected/28ca9811-81f8-4113-b7c0-ddcb16a66ad9-kube-api-access-2w46v\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.603080 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwvfp\" (UniqueName: \"kubernetes.io/projected/0ffe5a55-d80d-47ce-9fc9-0611c97437d1-kube-api-access-rwvfp\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.603092 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qbtw\" (UniqueName: \"kubernetes.io/projected/71091fa8-ba4a-42b7-b738-a6b012c5b65c-kube-api-access-9qbtw\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.628248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8454698868-xzzl2" event={"ID":"71091fa8-ba4a-42b7-b738-a6b012c5b65c","Type":"ContainerDied","Data":"8ddbf07efb39bd64f43d898b232d9e11186dd4ea0b665ea2919fd71b603b4471"} Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.628282 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8454698868-xzzl2" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.633609 4790 scope.go:117] "RemoveContainer" containerID="40fed40024a54795fdebb448fab0ca6cabc5e85aef38ed269bd60974b3067a14" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.642055 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42c4562a-910e-47f6-befc-effb8f851374","Type":"ContainerStarted","Data":"7609e325ca1fbf157e38dba2929e072e5800855856ca70cacf484a194bfe6720"} Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.673106 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71091fa8-ba4a-42b7-b738-a6b012c5b65c" (UID: "71091fa8-ba4a-42b7-b738-a6b012c5b65c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.673292 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b240084f-7a9d-4dca-89a2-097e9904c9d8","Type":"ContainerStarted","Data":"6e35397ceee89337e761cce23007eac86e8710600413274aa29eb15d9a6fe140"} Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.676830 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-h5qqd" event={"ID":"28ca9811-81f8-4113-b7c0-ddcb16a66ad9","Type":"ContainerDied","Data":"019d33df680131ca2fa34954463d7e5611e8d060bb23c5fb2ff8337e7786e5e4"} Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.676875 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="019d33df680131ca2fa34954463d7e5611e8d060bb23c5fb2ff8337e7786e5e4" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.676986 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-h5qqd" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.705028 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.707247 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-q2zhq" event={"ID":"0ffe5a55-d80d-47ce-9fc9-0611c97437d1","Type":"ContainerDied","Data":"fae3dfcd0544f4a5106f3020fb2ab504905ad93bcccf0490cef1d850d24d1b34"} Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.707285 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fae3dfcd0544f4a5106f3020fb2ab504905ad93bcccf0490cef1d850d24d1b34" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.707388 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-q2zhq" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.711595 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1289ded7-2702-43dc-9d28-00567ba7b668","Type":"ContainerStarted","Data":"6f93470a6b007ab67974bc4f559266dd19fcf2a84c969a48fddc3e060000b863"} Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.717786 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1558347-4d48-4e51-a89c-a3d8664016a4","Type":"ContainerStarted","Data":"f5313d396cc25709b8d05470aecd1d8d56dc758ab51b39bda5b4125a94531a34"} Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.719760 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "71091fa8-ba4a-42b7-b738-a6b012c5b65c" (UID: 
"71091fa8-ba4a-42b7-b738-a6b012c5b65c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.764898 4790 scope.go:117] "RemoveContainer" containerID="aa79f711bd93e6248a4171b28d343a171e80f79979523fb04a3f7639633acf14" Oct 03 08:59:54 crc kubenswrapper[4790]: I1003 08:59:54.807014 4790 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71091fa8-ba4a-42b7-b738-a6b012c5b65c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.001266 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8454698868-xzzl2"] Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.017497 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8454698868-xzzl2"] Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.319502 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rq7b8" Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.327910 4790 scope.go:117] "RemoveContainer" containerID="f7e1596a2e3a93c9acb16c7a72441fad846a55cd8aad3832a2f6428d26e70811" Oct 03 08:59:55 crc kubenswrapper[4790]: E1003 08:59:55.329544 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(f0afbb0f-f4dc-4814-8c7f-791351972c0b)\"" pod="openstack/watcher-decision-engine-0" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.439681 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7swv\" (UniqueName: \"kubernetes.io/projected/231c1e2d-3330-48e9-80cb-6f490ba17e37-kube-api-access-f7swv\") pod \"231c1e2d-3330-48e9-80cb-6f490ba17e37\" (UID: \"231c1e2d-3330-48e9-80cb-6f490ba17e37\") " Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.460060 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231c1e2d-3330-48e9-80cb-6f490ba17e37-kube-api-access-f7swv" (OuterVolumeSpecName: "kube-api-access-f7swv") pod "231c1e2d-3330-48e9-80cb-6f490ba17e37" (UID: "231c1e2d-3330-48e9-80cb-6f490ba17e37"). InnerVolumeSpecName "kube-api-access-f7swv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.544010 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7swv\" (UniqueName: \"kubernetes.io/projected/231c1e2d-3330-48e9-80cb-6f490ba17e37-kube-api-access-f7swv\") on node \"crc\" DevicePath \"\"" Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.731117 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1289ded7-2702-43dc-9d28-00567ba7b668","Type":"ContainerStarted","Data":"e93b660c3745b23a0e083084df6a68417a3cf63db528a20b3faa55f659d1a6df"} Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.735428 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1558347-4d48-4e51-a89c-a3d8664016a4","Type":"ContainerStarted","Data":"5b9c44fe14d94cc8157b8e0ca38aa881c8c90c6f145a133592df93bb5e3979b4"} Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.743437 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-rq7b8" Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.743496 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-rq7b8" event={"ID":"231c1e2d-3330-48e9-80cb-6f490ba17e37","Type":"ContainerDied","Data":"f9cc868dc4cf11bfd232d619cd4d949f4b0e4153fc81f21aa29ecb67df2ad976"} Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.743532 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9cc868dc4cf11bfd232d619cd4d949f4b0e4153fc81f21aa29ecb67df2ad976" Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.745514 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b240084f-7a9d-4dca-89a2-097e9904c9d8","Type":"ContainerStarted","Data":"4b5db09faa3aba2f47ac76ad755fd446f439e2dbcd278e953321395ca15d9504"} Oct 03 08:59:55 crc kubenswrapper[4790]: I1003 08:59:55.757567 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.757522552 podStartE2EDuration="4.757522552s" podCreationTimestamp="2025-10-03 08:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:55.747731244 +0000 UTC m=+1182.100665512" watchObservedRunningTime="2025-10-03 08:59:55.757522552 +0000 UTC m=+1182.110456790" Oct 03 08:59:56 crc kubenswrapper[4790]: I1003 08:59:56.206994 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59b887b4f6-vdnlv" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 08:59:56 crc kubenswrapper[4790]: I1003 08:59:56.207386 4790 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/barbican-api-59b887b4f6-vdnlv" podUID="f334f308-e77d-45ff-8c57-30164f037180" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 08:59:56 crc kubenswrapper[4790]: I1003 08:59:56.342115 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" path="/var/lib/kubelet/pods/71091fa8-ba4a-42b7-b738-a6b012c5b65c/volumes" Oct 03 08:59:56 crc kubenswrapper[4790]: I1003 08:59:56.757418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d1558347-4d48-4e51-a89c-a3d8664016a4","Type":"ContainerStarted","Data":"3eda0f2dfe38c6c19b892b80f6e083e17c3ea2ee5be8501cdc2f90dfc6e55286"} Oct 03 08:59:56 crc kubenswrapper[4790]: I1003 08:59:56.759836 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42c4562a-910e-47f6-befc-effb8f851374","Type":"ContainerStarted","Data":"039f246b28f08b0f3587b14e7e6a2f757ea255b360034a8066f75a004e344d0f"} Oct 03 08:59:56 crc kubenswrapper[4790]: I1003 08:59:56.782828 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.782811531 podStartE2EDuration="4.782811531s" podCreationTimestamp="2025-10-03 08:59:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:56.778915034 +0000 UTC m=+1183.131849272" watchObservedRunningTime="2025-10-03 08:59:56.782811531 +0000 UTC m=+1183.135745769" Oct 03 08:59:56 crc kubenswrapper[4790]: I1003 08:59:56.808173 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.808152215 podStartE2EDuration="5.808152215s" 
podCreationTimestamp="2025-10-03 08:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 08:59:56.806061568 +0000 UTC m=+1183.158995826" watchObservedRunningTime="2025-10-03 08:59:56.808152215 +0000 UTC m=+1183.161086483" Oct 03 08:59:57 crc kubenswrapper[4790]: I1003 08:59:57.105701 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 03 08:59:57 crc kubenswrapper[4790]: I1003 08:59:57.169127 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 08:59:57 crc kubenswrapper[4790]: I1003 08:59:57.780594 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b240084f-7a9d-4dca-89a2-097e9904c9d8","Type":"ContainerStarted","Data":"0900704de45ab7ab971162b5bd26d7cdc98dd24f14d425080eb1d39ba170f191"} Oct 03 08:59:57 crc kubenswrapper[4790]: I1003 08:59:57.781252 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 08:59:57 crc kubenswrapper[4790]: I1003 08:59:57.781772 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="sg-core" containerID="cri-o://4b5db09faa3aba2f47ac76ad755fd446f439e2dbcd278e953321395ca15d9504" gracePeriod=30 Oct 03 08:59:57 crc kubenswrapper[4790]: I1003 08:59:57.781816 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="proxy-httpd" containerID="cri-o://0900704de45ab7ab971162b5bd26d7cdc98dd24f14d425080eb1d39ba170f191" gracePeriod=30 Oct 03 08:59:57 crc kubenswrapper[4790]: I1003 08:59:57.781843 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" 
containerName="ceilometer-notification-agent" containerID="cri-o://6e35397ceee89337e761cce23007eac86e8710600413274aa29eb15d9a6fe140" gracePeriod=30 Oct 03 08:59:57 crc kubenswrapper[4790]: I1003 08:59:57.781722 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="ceilometer-central-agent" containerID="cri-o://cd1266527775f4ca99436e80c5cbf44aeccc122b210d128117301265eb46d58e" gracePeriod=30 Oct 03 08:59:58 crc kubenswrapper[4790]: I1003 08:59:58.793687 4790 generic.go:334] "Generic (PLEG): container finished" podID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerID="4b5db09faa3aba2f47ac76ad755fd446f439e2dbcd278e953321395ca15d9504" exitCode=2 Oct 03 08:59:58 crc kubenswrapper[4790]: I1003 08:59:58.793716 4790 generic.go:334] "Generic (PLEG): container finished" podID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerID="6e35397ceee89337e761cce23007eac86e8710600413274aa29eb15d9a6fe140" exitCode=0 Oct 03 08:59:58 crc kubenswrapper[4790]: I1003 08:59:58.793737 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b240084f-7a9d-4dca-89a2-097e9904c9d8","Type":"ContainerDied","Data":"4b5db09faa3aba2f47ac76ad755fd446f439e2dbcd278e953321395ca15d9504"} Oct 03 08:59:58 crc kubenswrapper[4790]: I1003 08:59:58.793764 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b240084f-7a9d-4dca-89a2-097e9904c9d8","Type":"ContainerDied","Data":"6e35397ceee89337e761cce23007eac86e8710600413274aa29eb15d9a6fe140"} Oct 03 08:59:59 crc kubenswrapper[4790]: I1003 08:59:59.807170 4790 generic.go:334] "Generic (PLEG): container finished" podID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerID="cd1266527775f4ca99436e80c5cbf44aeccc122b210d128117301265eb46d58e" exitCode=0 Oct 03 08:59:59 crc kubenswrapper[4790]: I1003 08:59:59.807243 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b240084f-7a9d-4dca-89a2-097e9904c9d8","Type":"ContainerDied","Data":"cd1266527775f4ca99436e80c5cbf44aeccc122b210d128117301265eb46d58e"} Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.144739 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.314285585 podStartE2EDuration="9.144711246s" podCreationTimestamp="2025-10-03 08:59:51 +0000 UTC" firstStartedPulling="2025-10-03 08:59:52.569783679 +0000 UTC m=+1178.922717917" lastFinishedPulling="2025-10-03 08:59:57.40020934 +0000 UTC m=+1183.753143578" observedRunningTime="2025-10-03 08:59:57.817899059 +0000 UTC m=+1184.170833307" watchObservedRunningTime="2025-10-03 09:00:00.144711246 +0000 UTC m=+1186.497645484" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.148527 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz"] Oct 03 09:00:00 crc kubenswrapper[4790]: E1003 09:00:00.148940 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" containerName="neutron-api" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.148963 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" containerName="neutron-api" Oct 03 09:00:00 crc kubenswrapper[4790]: E1003 09:00:00.148987 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" containerName="neutron-httpd" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.148995 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" containerName="neutron-httpd" Oct 03 09:00:00 crc kubenswrapper[4790]: E1003 09:00:00.149014 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231c1e2d-3330-48e9-80cb-6f490ba17e37" containerName="mariadb-database-create" Oct 03 09:00:00 crc 
kubenswrapper[4790]: I1003 09:00:00.149021 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="231c1e2d-3330-48e9-80cb-6f490ba17e37" containerName="mariadb-database-create" Oct 03 09:00:00 crc kubenswrapper[4790]: E1003 09:00:00.149049 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ffe5a55-d80d-47ce-9fc9-0611c97437d1" containerName="mariadb-database-create" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.149055 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ffe5a55-d80d-47ce-9fc9-0611c97437d1" containerName="mariadb-database-create" Oct 03 09:00:00 crc kubenswrapper[4790]: E1003 09:00:00.149066 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ca9811-81f8-4113-b7c0-ddcb16a66ad9" containerName="mariadb-database-create" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.149071 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ca9811-81f8-4113-b7c0-ddcb16a66ad9" containerName="mariadb-database-create" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.149245 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="231c1e2d-3330-48e9-80cb-6f490ba17e37" containerName="mariadb-database-create" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.149261 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ca9811-81f8-4113-b7c0-ddcb16a66ad9" containerName="mariadb-database-create" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.149278 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ffe5a55-d80d-47ce-9fc9-0611c97437d1" containerName="mariadb-database-create" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.149285 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" containerName="neutron-api" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.149302 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="71091fa8-ba4a-42b7-b738-a6b012c5b65c" containerName="neutron-httpd" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.150027 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.164385 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.164531 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.171289 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz"] Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.227465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fa19264-9605-4c16-a1d4-b412e599ebb1-secret-volume\") pod \"collect-profiles-29324700-zkgcz\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.227519 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fa19264-9605-4c16-a1d4-b412e599ebb1-config-volume\") pod \"collect-profiles-29324700-zkgcz\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.227550 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f6xl\" (UniqueName: 
\"kubernetes.io/projected/5fa19264-9605-4c16-a1d4-b412e599ebb1-kube-api-access-2f6xl\") pod \"collect-profiles-29324700-zkgcz\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.328897 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f6xl\" (UniqueName: \"kubernetes.io/projected/5fa19264-9605-4c16-a1d4-b412e599ebb1-kube-api-access-2f6xl\") pod \"collect-profiles-29324700-zkgcz\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.329098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fa19264-9605-4c16-a1d4-b412e599ebb1-secret-volume\") pod \"collect-profiles-29324700-zkgcz\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.329130 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fa19264-9605-4c16-a1d4-b412e599ebb1-config-volume\") pod \"collect-profiles-29324700-zkgcz\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.330025 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fa19264-9605-4c16-a1d4-b412e599ebb1-config-volume\") pod \"collect-profiles-29324700-zkgcz\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 
09:00:00.336434 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fa19264-9605-4c16-a1d4-b412e599ebb1-secret-volume\") pod \"collect-profiles-29324700-zkgcz\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.355176 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f6xl\" (UniqueName: \"kubernetes.io/projected/5fa19264-9605-4c16-a1d4-b412e599ebb1-kube-api-access-2f6xl\") pod \"collect-profiles-29324700-zkgcz\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.470990 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:00 crc kubenswrapper[4790]: I1003 09:00:00.976533 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz"] Oct 03 09:00:01 crc kubenswrapper[4790]: I1003 09:00:01.825039 4790 generic.go:334] "Generic (PLEG): container finished" podID="5fa19264-9605-4c16-a1d4-b412e599ebb1" containerID="2de2f05a97ca023e000fd486bdb5e97319530dd3d0544ccc7b0ef8575bc70a81" exitCode=0 Oct 03 09:00:01 crc kubenswrapper[4790]: I1003 09:00:01.825108 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" event={"ID":"5fa19264-9605-4c16-a1d4-b412e599ebb1","Type":"ContainerDied","Data":"2de2f05a97ca023e000fd486bdb5e97319530dd3d0544ccc7b0ef8575bc70a81"} Oct 03 09:00:01 crc kubenswrapper[4790]: I1003 09:00:01.825320 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" 
event={"ID":"5fa19264-9605-4c16-a1d4-b412e599ebb1","Type":"ContainerStarted","Data":"40a7231f28166236e58b5628ab9648e676410c1c172fd5b8c1bb9f84bc98f135"} Oct 03 09:00:01 crc kubenswrapper[4790]: I1003 09:00:01.925611 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:01 crc kubenswrapper[4790]: I1003 09:00:01.925658 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:01 crc kubenswrapper[4790]: I1003 09:00:01.926333 4790 scope.go:117] "RemoveContainer" containerID="f7e1596a2e3a93c9acb16c7a72441fad846a55cd8aad3832a2f6428d26e70811" Oct 03 09:00:02 crc kubenswrapper[4790]: I1003 09:00:02.245822 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 09:00:02 crc kubenswrapper[4790]: I1003 09:00:02.245881 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 03 09:00:02 crc kubenswrapper[4790]: I1003 09:00:02.286434 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 09:00:02 crc kubenswrapper[4790]: I1003 09:00:02.310714 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 03 09:00:02 crc kubenswrapper[4790]: I1003 09:00:02.319898 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 03 09:00:02 crc kubenswrapper[4790]: I1003 09:00:02.852325 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f0afbb0f-f4dc-4814-8c7f-791351972c0b","Type":"ContainerStarted","Data":"f2af8048b69e7ae948e0156778d10832fb1b1892abc42d130934e3ab87b85c32"} Oct 03 09:00:02 crc kubenswrapper[4790]: I1003 09:00:02.853179 4790 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 09:00:02 crc kubenswrapper[4790]: I1003 09:00:02.853207 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.072668 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.072715 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.153194 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.158268 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.355387 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.407795 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fa19264-9605-4c16-a1d4-b412e599ebb1-config-volume\") pod \"5fa19264-9605-4c16-a1d4-b412e599ebb1\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.407981 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f6xl\" (UniqueName: \"kubernetes.io/projected/5fa19264-9605-4c16-a1d4-b412e599ebb1-kube-api-access-2f6xl\") pod \"5fa19264-9605-4c16-a1d4-b412e599ebb1\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.408087 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fa19264-9605-4c16-a1d4-b412e599ebb1-secret-volume\") pod \"5fa19264-9605-4c16-a1d4-b412e599ebb1\" (UID: \"5fa19264-9605-4c16-a1d4-b412e599ebb1\") " Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.413841 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fa19264-9605-4c16-a1d4-b412e599ebb1-config-volume" (OuterVolumeSpecName: "config-volume") pod "5fa19264-9605-4c16-a1d4-b412e599ebb1" (UID: "5fa19264-9605-4c16-a1d4-b412e599ebb1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.432104 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fa19264-9605-4c16-a1d4-b412e599ebb1-kube-api-access-2f6xl" (OuterVolumeSpecName: "kube-api-access-2f6xl") pod "5fa19264-9605-4c16-a1d4-b412e599ebb1" (UID: "5fa19264-9605-4c16-a1d4-b412e599ebb1"). 
InnerVolumeSpecName "kube-api-access-2f6xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.433714 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa19264-9605-4c16-a1d4-b412e599ebb1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5fa19264-9605-4c16-a1d4-b412e599ebb1" (UID: "5fa19264-9605-4c16-a1d4-b412e599ebb1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.515931 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fa19264-9605-4c16-a1d4-b412e599ebb1-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.516243 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f6xl\" (UniqueName: \"kubernetes.io/projected/5fa19264-9605-4c16-a1d4-b412e599ebb1-kube-api-access-2f6xl\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.516254 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fa19264-9605-4c16-a1d4-b412e599ebb1-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:03 crc kubenswrapper[4790]: E1003 09:00:03.682288 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice/crio-b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice/crio-166a3036570750727b0dd758229bd28aafecd40eb06b2140fd160ae44b5f7978\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice\": RecentStats: unable to find data in memory cache]" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.865697 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.867095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz" event={"ID":"5fa19264-9605-4c16-a1d4-b412e599ebb1","Type":"ContainerDied","Data":"40a7231f28166236e58b5628ab9648e676410c1c172fd5b8c1bb9f84bc98f135"} Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.867127 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40a7231f28166236e58b5628ab9648e676410c1c172fd5b8c1bb9f84bc98f135" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.867145 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 09:00:03 crc kubenswrapper[4790]: I1003 09:00:03.867408 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 03 09:00:05 crc kubenswrapper[4790]: I1003 09:00:05.047019 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 03 09:00:05 crc kubenswrapper[4790]: I1003 09:00:05.047405 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:00:05 crc kubenswrapper[4790]: I1003 09:00:05.299936 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Oct 03 09:00:05 crc kubenswrapper[4790]: I1003 09:00:05.735356 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 09:00:05 crc kubenswrapper[4790]: I1003 09:00:05.884856 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 03 09:00:05 crc kubenswrapper[4790]: I1003 09:00:05.953619 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.315492 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-510d-account-create-n6565"] Oct 03 09:00:08 crc kubenswrapper[4790]: E1003 09:00:08.316259 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa19264-9605-4c16-a1d4-b412e599ebb1" containerName="collect-profiles" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.316272 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa19264-9605-4c16-a1d4-b412e599ebb1" containerName="collect-profiles" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.316470 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa19264-9605-4c16-a1d4-b412e599ebb1" containerName="collect-profiles" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.317094 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-510d-account-create-n6565" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.320269 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.341807 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-510d-account-create-n6565"] Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.509191 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g4m5\" (UniqueName: \"kubernetes.io/projected/0a10df34-7c11-4117-9c35-81b9fa58e7e9-kube-api-access-4g4m5\") pod \"nova-api-510d-account-create-n6565\" (UID: \"0a10df34-7c11-4117-9c35-81b9fa58e7e9\") " pod="openstack/nova-api-510d-account-create-n6565" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.512354 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4d8b-account-create-qx5g5"] Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.513943 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4d8b-account-create-qx5g5" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.516860 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.524262 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4d8b-account-create-qx5g5"] Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.611600 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw4df\" (UniqueName: \"kubernetes.io/projected/65738136-e138-4c95-80a9-ee6cc6b54f8f-kube-api-access-lw4df\") pod \"nova-cell0-4d8b-account-create-qx5g5\" (UID: \"65738136-e138-4c95-80a9-ee6cc6b54f8f\") " pod="openstack/nova-cell0-4d8b-account-create-qx5g5" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.611709 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g4m5\" (UniqueName: \"kubernetes.io/projected/0a10df34-7c11-4117-9c35-81b9fa58e7e9-kube-api-access-4g4m5\") pod \"nova-api-510d-account-create-n6565\" (UID: \"0a10df34-7c11-4117-9c35-81b9fa58e7e9\") " pod="openstack/nova-api-510d-account-create-n6565" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.643407 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g4m5\" (UniqueName: \"kubernetes.io/projected/0a10df34-7c11-4117-9c35-81b9fa58e7e9-kube-api-access-4g4m5\") pod \"nova-api-510d-account-create-n6565\" (UID: \"0a10df34-7c11-4117-9c35-81b9fa58e7e9\") " pod="openstack/nova-api-510d-account-create-n6565" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.713188 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4df\" (UniqueName: \"kubernetes.io/projected/65738136-e138-4c95-80a9-ee6cc6b54f8f-kube-api-access-lw4df\") pod \"nova-cell0-4d8b-account-create-qx5g5\" (UID: 
\"65738136-e138-4c95-80a9-ee6cc6b54f8f\") " pod="openstack/nova-cell0-4d8b-account-create-qx5g5" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.726109 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e243-account-create-6jxhb"] Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.727683 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e243-account-create-6jxhb" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.729662 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.736425 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw4df\" (UniqueName: \"kubernetes.io/projected/65738136-e138-4c95-80a9-ee6cc6b54f8f-kube-api-access-lw4df\") pod \"nova-cell0-4d8b-account-create-qx5g5\" (UID: \"65738136-e138-4c95-80a9-ee6cc6b54f8f\") " pod="openstack/nova-cell0-4d8b-account-create-qx5g5" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.752684 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e243-account-create-6jxhb"] Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.842275 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4d8b-account-create-qx5g5" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.917243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg79m\" (UniqueName: \"kubernetes.io/projected/1a0c8205-2d15-4333-ad01-ccfb999d5058-kube-api-access-cg79m\") pod \"nova-cell1-e243-account-create-6jxhb\" (UID: \"1a0c8205-2d15-4333-ad01-ccfb999d5058\") " pod="openstack/nova-cell1-e243-account-create-6jxhb" Oct 03 09:00:08 crc kubenswrapper[4790]: I1003 09:00:08.948925 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-510d-account-create-n6565" Oct 03 09:00:09 crc kubenswrapper[4790]: I1003 09:00:09.018566 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg79m\" (UniqueName: \"kubernetes.io/projected/1a0c8205-2d15-4333-ad01-ccfb999d5058-kube-api-access-cg79m\") pod \"nova-cell1-e243-account-create-6jxhb\" (UID: \"1a0c8205-2d15-4333-ad01-ccfb999d5058\") " pod="openstack/nova-cell1-e243-account-create-6jxhb" Oct 03 09:00:09 crc kubenswrapper[4790]: I1003 09:00:09.289071 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg79m\" (UniqueName: \"kubernetes.io/projected/1a0c8205-2d15-4333-ad01-ccfb999d5058-kube-api-access-cg79m\") pod \"nova-cell1-e243-account-create-6jxhb\" (UID: \"1a0c8205-2d15-4333-ad01-ccfb999d5058\") " pod="openstack/nova-cell1-e243-account-create-6jxhb" Oct 03 09:00:09 crc kubenswrapper[4790]: I1003 09:00:09.347289 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4d8b-account-create-qx5g5"] Oct 03 09:00:09 crc kubenswrapper[4790]: I1003 09:00:09.405739 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e243-account-create-6jxhb" Oct 03 09:00:09 crc kubenswrapper[4790]: I1003 09:00:09.725478 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-510d-account-create-n6565"] Oct 03 09:00:09 crc kubenswrapper[4790]: W1003 09:00:09.737122 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a10df34_7c11_4117_9c35_81b9fa58e7e9.slice/crio-809d4251de8a9e3bf93ce70f6f31ef3ce78f732fc7575a200725c2b0d26f69c8 WatchSource:0}: Error finding container 809d4251de8a9e3bf93ce70f6f31ef3ce78f732fc7575a200725c2b0d26f69c8: Status 404 returned error can't find the container with id 809d4251de8a9e3bf93ce70f6f31ef3ce78f732fc7575a200725c2b0d26f69c8 Oct 03 09:00:09 crc kubenswrapper[4790]: I1003 09:00:09.927280 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4d8b-account-create-qx5g5" event={"ID":"65738136-e138-4c95-80a9-ee6cc6b54f8f","Type":"ContainerStarted","Data":"93b11fd1cc534e59c11142d27c2367f2da2cd23b978580621a93c89a9cad1b82"} Oct 03 09:00:09 crc kubenswrapper[4790]: I1003 09:00:09.927335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4d8b-account-create-qx5g5" event={"ID":"65738136-e138-4c95-80a9-ee6cc6b54f8f","Type":"ContainerStarted","Data":"8b342fcd3ba6c729dd8bdb641193275d0c393080c0e2c9a1f977b52a3ec684f8"} Oct 03 09:00:09 crc kubenswrapper[4790]: I1003 09:00:09.930164 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-510d-account-create-n6565" event={"ID":"0a10df34-7c11-4117-9c35-81b9fa58e7e9","Type":"ContainerStarted","Data":"809d4251de8a9e3bf93ce70f6f31ef3ce78f732fc7575a200725c2b0d26f69c8"} Oct 03 09:00:09 crc kubenswrapper[4790]: I1003 09:00:09.952629 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e243-account-create-6jxhb"] Oct 03 09:00:09 crc kubenswrapper[4790]: I1003 09:00:09.978089 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4d8b-account-create-qx5g5" podStartSLOduration=1.9780652060000001 podStartE2EDuration="1.978065206s" podCreationTimestamp="2025-10-03 09:00:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:09.958968793 +0000 UTC m=+1196.311903041" watchObservedRunningTime="2025-10-03 09:00:09.978065206 +0000 UTC m=+1196.330999444" Oct 03 09:00:10 crc kubenswrapper[4790]: I1003 09:00:10.941489 4790 generic.go:334] "Generic (PLEG): container finished" podID="65738136-e138-4c95-80a9-ee6cc6b54f8f" containerID="93b11fd1cc534e59c11142d27c2367f2da2cd23b978580621a93c89a9cad1b82" exitCode=0 Oct 03 09:00:10 crc kubenswrapper[4790]: I1003 09:00:10.941542 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4d8b-account-create-qx5g5" event={"ID":"65738136-e138-4c95-80a9-ee6cc6b54f8f","Type":"ContainerDied","Data":"93b11fd1cc534e59c11142d27c2367f2da2cd23b978580621a93c89a9cad1b82"} Oct 03 09:00:10 crc kubenswrapper[4790]: I1003 09:00:10.943919 4790 generic.go:334] "Generic (PLEG): container finished" podID="0a10df34-7c11-4117-9c35-81b9fa58e7e9" containerID="168b94606eb5a413ca725f93786204e75532fba048cab9f8cd2851139bd0b841" exitCode=0 Oct 03 09:00:10 crc kubenswrapper[4790]: I1003 09:00:10.943999 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-510d-account-create-n6565" event={"ID":"0a10df34-7c11-4117-9c35-81b9fa58e7e9","Type":"ContainerDied","Data":"168b94606eb5a413ca725f93786204e75532fba048cab9f8cd2851139bd0b841"} Oct 03 09:00:10 crc kubenswrapper[4790]: I1003 09:00:10.946200 4790 generic.go:334] "Generic (PLEG): container finished" podID="1a0c8205-2d15-4333-ad01-ccfb999d5058" containerID="9e6dd9a9680ae937c98491561b19adaedb04ca56fedc6564020ecf1aed06adfb" exitCode=0 Oct 03 09:00:10 crc kubenswrapper[4790]: I1003 09:00:10.946244 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e243-account-create-6jxhb" event={"ID":"1a0c8205-2d15-4333-ad01-ccfb999d5058","Type":"ContainerDied","Data":"9e6dd9a9680ae937c98491561b19adaedb04ca56fedc6564020ecf1aed06adfb"} Oct 03 09:00:10 crc kubenswrapper[4790]: I1003 09:00:10.946266 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e243-account-create-6jxhb" event={"ID":"1a0c8205-2d15-4333-ad01-ccfb999d5058","Type":"ContainerStarted","Data":"71a578d6be58c8f1a67fb3f50fc89027126675d017bfa0e1a15afc101e33d1fd"} Oct 03 09:00:11 crc kubenswrapper[4790]: I1003 09:00:11.925864 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:11 crc kubenswrapper[4790]: I1003 09:00:11.959995 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:11 crc kubenswrapper[4790]: I1003 09:00:11.961073 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.004609 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.050142 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.591129 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-510d-account-create-n6565" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.600405 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4d8b-account-create-qx5g5" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.609910 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e243-account-create-6jxhb" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.699745 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw4df\" (UniqueName: \"kubernetes.io/projected/65738136-e138-4c95-80a9-ee6cc6b54f8f-kube-api-access-lw4df\") pod \"65738136-e138-4c95-80a9-ee6cc6b54f8f\" (UID: \"65738136-e138-4c95-80a9-ee6cc6b54f8f\") " Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.700104 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g4m5\" (UniqueName: \"kubernetes.io/projected/0a10df34-7c11-4117-9c35-81b9fa58e7e9-kube-api-access-4g4m5\") pod \"0a10df34-7c11-4117-9c35-81b9fa58e7e9\" (UID: \"0a10df34-7c11-4117-9c35-81b9fa58e7e9\") " Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.700210 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg79m\" (UniqueName: \"kubernetes.io/projected/1a0c8205-2d15-4333-ad01-ccfb999d5058-kube-api-access-cg79m\") pod \"1a0c8205-2d15-4333-ad01-ccfb999d5058\" (UID: \"1a0c8205-2d15-4333-ad01-ccfb999d5058\") " Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.706978 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65738136-e138-4c95-80a9-ee6cc6b54f8f-kube-api-access-lw4df" (OuterVolumeSpecName: "kube-api-access-lw4df") pod "65738136-e138-4c95-80a9-ee6cc6b54f8f" (UID: "65738136-e138-4c95-80a9-ee6cc6b54f8f"). InnerVolumeSpecName "kube-api-access-lw4df". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.713560 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a0c8205-2d15-4333-ad01-ccfb999d5058-kube-api-access-cg79m" (OuterVolumeSpecName: "kube-api-access-cg79m") pod "1a0c8205-2d15-4333-ad01-ccfb999d5058" (UID: "1a0c8205-2d15-4333-ad01-ccfb999d5058"). InnerVolumeSpecName "kube-api-access-cg79m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.717148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a10df34-7c11-4117-9c35-81b9fa58e7e9-kube-api-access-4g4m5" (OuterVolumeSpecName: "kube-api-access-4g4m5") pod "0a10df34-7c11-4117-9c35-81b9fa58e7e9" (UID: "0a10df34-7c11-4117-9c35-81b9fa58e7e9"). InnerVolumeSpecName "kube-api-access-4g4m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.802543 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw4df\" (UniqueName: \"kubernetes.io/projected/65738136-e138-4c95-80a9-ee6cc6b54f8f-kube-api-access-lw4df\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.802594 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g4m5\" (UniqueName: \"kubernetes.io/projected/0a10df34-7c11-4117-9c35-81b9fa58e7e9-kube-api-access-4g4m5\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.802608 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg79m\" (UniqueName: \"kubernetes.io/projected/1a0c8205-2d15-4333-ad01-ccfb999d5058-kube-api-access-cg79m\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.963862 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e243-account-create-6jxhb" 
event={"ID":"1a0c8205-2d15-4333-ad01-ccfb999d5058","Type":"ContainerDied","Data":"71a578d6be58c8f1a67fb3f50fc89027126675d017bfa0e1a15afc101e33d1fd"} Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.963902 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71a578d6be58c8f1a67fb3f50fc89027126675d017bfa0e1a15afc101e33d1fd" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.963912 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e243-account-create-6jxhb" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.965750 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4d8b-account-create-qx5g5" event={"ID":"65738136-e138-4c95-80a9-ee6cc6b54f8f","Type":"ContainerDied","Data":"8b342fcd3ba6c729dd8bdb641193275d0c393080c0e2c9a1f977b52a3ec684f8"} Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.965773 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b342fcd3ba6c729dd8bdb641193275d0c393080c0e2c9a1f977b52a3ec684f8" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.965818 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4d8b-account-create-qx5g5" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.976251 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-510d-account-create-n6565" event={"ID":"0a10df34-7c11-4117-9c35-81b9fa58e7e9","Type":"ContainerDied","Data":"809d4251de8a9e3bf93ce70f6f31ef3ce78f732fc7575a200725c2b0d26f69c8"} Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.976282 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-510d-account-create-n6565" Oct 03 09:00:12 crc kubenswrapper[4790]: I1003 09:00:12.976304 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="809d4251de8a9e3bf93ce70f6f31ef3ce78f732fc7575a200725c2b0d26f69c8" Oct 03 09:00:13 crc kubenswrapper[4790]: E1003 09:00:13.964125 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf20e14b_d7d8_4cab_8b7c_61122da75fbf.slice/crio-166a3036570750727b0dd758229bd28aafecd40eb06b2140fd160ae44b5f7978\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64754289_91b5_4f84_9809_6271b6a13031.slice/crio-b0ff64c50c118d67a4cf15daa2b8ee6c4f295601ca118a67059cebe19412eeaf\": RecentStats: unable to find data in memory cache]" Oct 03 09:00:13 crc kubenswrapper[4790]: I1003 09:00:13.986053 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" containerID="cri-o://f2af8048b69e7ae948e0156778d10832fb1b1892abc42d130934e3ab87b85c32" gracePeriod=30 Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.005516 4790 generic.go:334] "Generic (PLEG): container finished" podID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerID="f2af8048b69e7ae948e0156778d10832fb1b1892abc42d130934e3ab87b85c32" exitCode=0 Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.005612 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f0afbb0f-f4dc-4814-8c7f-791351972c0b","Type":"ContainerDied","Data":"f2af8048b69e7ae948e0156778d10832fb1b1892abc42d130934e3ab87b85c32"} Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.006161 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"f0afbb0f-f4dc-4814-8c7f-791351972c0b","Type":"ContainerDied","Data":"a908578b6502ff3984c4ef14391723bc62abedc7411bba3d07eac6cc51be543c"} Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.006175 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a908578b6502ff3984c4ef14391723bc62abedc7411bba3d07eac6cc51be543c" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.006191 4790 scope.go:117] "RemoveContainer" containerID="f7e1596a2e3a93c9acb16c7a72441fad846a55cd8aad3832a2f6428d26e70811" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.008238 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.176818 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-custom-prometheus-ca\") pod \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.176868 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0afbb0f-f4dc-4814-8c7f-791351972c0b-logs\") pod \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.177008 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncf84\" (UniqueName: \"kubernetes.io/projected/f0afbb0f-f4dc-4814-8c7f-791351972c0b-kube-api-access-ncf84\") pod \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.177112 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-config-data\") pod \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.177130 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-combined-ca-bundle\") pod \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\" (UID: \"f0afbb0f-f4dc-4814-8c7f-791351972c0b\") " Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.177334 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f0afbb0f-f4dc-4814-8c7f-791351972c0b-logs" (OuterVolumeSpecName: "logs") pod "f0afbb0f-f4dc-4814-8c7f-791351972c0b" (UID: "f0afbb0f-f4dc-4814-8c7f-791351972c0b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.177555 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0afbb0f-f4dc-4814-8c7f-791351972c0b-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.183326 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0afbb0f-f4dc-4814-8c7f-791351972c0b-kube-api-access-ncf84" (OuterVolumeSpecName: "kube-api-access-ncf84") pod "f0afbb0f-f4dc-4814-8c7f-791351972c0b" (UID: "f0afbb0f-f4dc-4814-8c7f-791351972c0b"). InnerVolumeSpecName "kube-api-access-ncf84". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.206817 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0afbb0f-f4dc-4814-8c7f-791351972c0b" (UID: "f0afbb0f-f4dc-4814-8c7f-791351972c0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.209656 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "f0afbb0f-f4dc-4814-8c7f-791351972c0b" (UID: "f0afbb0f-f4dc-4814-8c7f-791351972c0b"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.229804 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-config-data" (OuterVolumeSpecName: "config-data") pod "f0afbb0f-f4dc-4814-8c7f-791351972c0b" (UID: "f0afbb0f-f4dc-4814-8c7f-791351972c0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.279403 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncf84\" (UniqueName: \"kubernetes.io/projected/f0afbb0f-f4dc-4814-8c7f-791351972c0b-kube-api-access-ncf84\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.279658 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.279752 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:16 crc kubenswrapper[4790]: I1003 09:00:16.279808 4790 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/f0afbb0f-f4dc-4814-8c7f-791351972c0b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.016176 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.048952 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.058740 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.072991 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 09:00:17 crc kubenswrapper[4790]: E1003 09:00:17.073805 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.073829 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:17 crc kubenswrapper[4790]: E1003 09:00:17.073845 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a10df34-7c11-4117-9c35-81b9fa58e7e9" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.073854 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a10df34-7c11-4117-9c35-81b9fa58e7e9" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4790]: E1003 09:00:17.073874 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0c8205-2d15-4333-ad01-ccfb999d5058" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.073885 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0c8205-2d15-4333-ad01-ccfb999d5058" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4790]: E1003 09:00:17.073899 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65738136-e138-4c95-80a9-ee6cc6b54f8f" containerName="mariadb-account-create" Oct 
03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.073907 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="65738136-e138-4c95-80a9-ee6cc6b54f8f" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4790]: E1003 09:00:17.073926 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.073933 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:17 crc kubenswrapper[4790]: E1003 09:00:17.073955 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.073964 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.074189 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.074206 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.074221 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.074239 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0c8205-2d15-4333-ad01-ccfb999d5058" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.074251 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0a10df34-7c11-4117-9c35-81b9fa58e7e9" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.074263 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="65738136-e138-4c95-80a9-ee6cc6b54f8f" containerName="mariadb-account-create" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.075102 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.080045 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.085359 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.195238 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-config-data\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.195350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.195407 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-logs\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " 
pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.195473 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.195503 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sttrw\" (UniqueName: \"kubernetes.io/projected/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-kube-api-access-sttrw\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.297221 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.297292 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-logs\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.297355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " 
pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.297382 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sttrw\" (UniqueName: \"kubernetes.io/projected/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-kube-api-access-sttrw\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.297509 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-config-data\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.297776 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-logs\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.303840 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.304114 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-config-data\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.304829 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.313593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sttrw\" (UniqueName: \"kubernetes.io/projected/5d8afb4c-874f-4d5d-ad9b-e11a24bd9799-kube-api-access-sttrw\") pod \"watcher-decision-engine-0\" (UID: \"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799\") " pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.395692 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:17 crc kubenswrapper[4790]: I1003 09:00:17.891561 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 03 09:00:17 crc kubenswrapper[4790]: W1003 09:00:17.893618 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d8afb4c_874f_4d5d_ad9b_e11a24bd9799.slice/crio-37c9b9b342a9e22858dc9a5c26365a700d6f86cefd3b4a9930981ca659c76d8a WatchSource:0}: Error finding container 37c9b9b342a9e22858dc9a5c26365a700d6f86cefd3b4a9930981ca659c76d8a: Status 404 returned error can't find the container with id 37c9b9b342a9e22858dc9a5c26365a700d6f86cefd3b4a9930981ca659c76d8a Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.029374 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799","Type":"ContainerStarted","Data":"37c9b9b342a9e22858dc9a5c26365a700d6f86cefd3b4a9930981ca659c76d8a"} Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.339474 4790 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" path="/var/lib/kubelet/pods/f0afbb0f-f4dc-4814-8c7f-791351972c0b/volumes" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.698807 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tzhvj"] Oct 03 09:00:18 crc kubenswrapper[4790]: E1003 09:00:18.699289 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.699312 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.699549 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0afbb0f-f4dc-4814-8c7f-791351972c0b" containerName="watcher-decision-engine" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.700391 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.703335 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mtdp6" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.703850 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.704109 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.717551 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tzhvj"] Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.827507 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-config-data\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.827800 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.827924 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-scripts\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " 
pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.828062 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b55tj\" (UniqueName: \"kubernetes.io/projected/42766976-3706-48c2-b456-1eef4f3d1303-kube-api-access-b55tj\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.929668 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b55tj\" (UniqueName: \"kubernetes.io/projected/42766976-3706-48c2-b456-1eef4f3d1303-kube-api-access-b55tj\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.929825 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-config-data\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.929850 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.929873 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-scripts\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: 
\"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.935767 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-scripts\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.936379 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.937419 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-config-data\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:18 crc kubenswrapper[4790]: I1003 09:00:18.947297 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b55tj\" (UniqueName: \"kubernetes.io/projected/42766976-3706-48c2-b456-1eef4f3d1303-kube-api-access-b55tj\") pod \"nova-cell0-conductor-db-sync-tzhvj\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:19 crc kubenswrapper[4790]: I1003 09:00:19.026958 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:19 crc kubenswrapper[4790]: I1003 09:00:19.055313 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"5d8afb4c-874f-4d5d-ad9b-e11a24bd9799","Type":"ContainerStarted","Data":"25aa0f02f4de5911a9715a2c44a3a07108a9401aec445aa322e92d314f2bacd5"} Oct 03 09:00:19 crc kubenswrapper[4790]: I1003 09:00:19.089433 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.08941115 podStartE2EDuration="2.08941115s" podCreationTimestamp="2025-10-03 09:00:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:19.083798496 +0000 UTC m=+1205.436732754" watchObservedRunningTime="2025-10-03 09:00:19.08941115 +0000 UTC m=+1205.442345398" Oct 03 09:00:19 crc kubenswrapper[4790]: I1003 09:00:19.571142 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tzhvj"] Oct 03 09:00:19 crc kubenswrapper[4790]: W1003 09:00:19.577333 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42766976_3706_48c2_b456_1eef4f3d1303.slice/crio-83f733a8b445a67799b91805bb7fbdc8f8697369422c22cead87a2aec2e9cf8a WatchSource:0}: Error finding container 83f733a8b445a67799b91805bb7fbdc8f8697369422c22cead87a2aec2e9cf8a: Status 404 returned error can't find the container with id 83f733a8b445a67799b91805bb7fbdc8f8697369422c22cead87a2aec2e9cf8a Oct 03 09:00:20 crc kubenswrapper[4790]: I1003 09:00:20.067512 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tzhvj" event={"ID":"42766976-3706-48c2-b456-1eef4f3d1303","Type":"ContainerStarted","Data":"83f733a8b445a67799b91805bb7fbdc8f8697369422c22cead87a2aec2e9cf8a"} Oct 03 09:00:21 
crc kubenswrapper[4790]: I1003 09:00:21.871222 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 03 09:00:27 crc kubenswrapper[4790]: I1003 09:00:27.396481 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:27 crc kubenswrapper[4790]: I1003 09:00:27.427501 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.165840 4790 generic.go:334] "Generic (PLEG): container finished" podID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerID="0900704de45ab7ab971162b5bd26d7cdc98dd24f14d425080eb1d39ba170f191" exitCode=137 Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.165908 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b240084f-7a9d-4dca-89a2-097e9904c9d8","Type":"ContainerDied","Data":"0900704de45ab7ab971162b5bd26d7cdc98dd24f14d425080eb1d39ba170f191"} Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.166613 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.228448 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.391537 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.499622 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-combined-ca-bundle\") pod \"b240084f-7a9d-4dca-89a2-097e9904c9d8\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.499817 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-run-httpd\") pod \"b240084f-7a9d-4dca-89a2-097e9904c9d8\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.499863 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smttp\" (UniqueName: \"kubernetes.io/projected/b240084f-7a9d-4dca-89a2-097e9904c9d8-kube-api-access-smttp\") pod \"b240084f-7a9d-4dca-89a2-097e9904c9d8\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.499946 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-log-httpd\") pod \"b240084f-7a9d-4dca-89a2-097e9904c9d8\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.499969 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-scripts\") pod \"b240084f-7a9d-4dca-89a2-097e9904c9d8\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.500375 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b240084f-7a9d-4dca-89a2-097e9904c9d8" (UID: "b240084f-7a9d-4dca-89a2-097e9904c9d8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.500448 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-config-data\") pod \"b240084f-7a9d-4dca-89a2-097e9904c9d8\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.500648 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b240084f-7a9d-4dca-89a2-097e9904c9d8" (UID: "b240084f-7a9d-4dca-89a2-097e9904c9d8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.500710 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-sg-core-conf-yaml\") pod \"b240084f-7a9d-4dca-89a2-097e9904c9d8\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.501359 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.501377 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b240084f-7a9d-4dca-89a2-097e9904c9d8-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.505721 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-scripts" (OuterVolumeSpecName: "scripts") pod "b240084f-7a9d-4dca-89a2-097e9904c9d8" (UID: "b240084f-7a9d-4dca-89a2-097e9904c9d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.505777 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b240084f-7a9d-4dca-89a2-097e9904c9d8-kube-api-access-smttp" (OuterVolumeSpecName: "kube-api-access-smttp") pod "b240084f-7a9d-4dca-89a2-097e9904c9d8" (UID: "b240084f-7a9d-4dca-89a2-097e9904c9d8"). InnerVolumeSpecName "kube-api-access-smttp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.528821 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b240084f-7a9d-4dca-89a2-097e9904c9d8" (UID: "b240084f-7a9d-4dca-89a2-097e9904c9d8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.602330 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b240084f-7a9d-4dca-89a2-097e9904c9d8" (UID: "b240084f-7a9d-4dca-89a2-097e9904c9d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.602634 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-combined-ca-bundle\") pod \"b240084f-7a9d-4dca-89a2-097e9904c9d8\" (UID: \"b240084f-7a9d-4dca-89a2-097e9904c9d8\") " Oct 03 09:00:28 crc kubenswrapper[4790]: W1003 09:00:28.602768 4790 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b240084f-7a9d-4dca-89a2-097e9904c9d8/volumes/kubernetes.io~secret/combined-ca-bundle Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.602784 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b240084f-7a9d-4dca-89a2-097e9904c9d8" (UID: "b240084f-7a9d-4dca-89a2-097e9904c9d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.603133 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smttp\" (UniqueName: \"kubernetes.io/projected/b240084f-7a9d-4dca-89a2-097e9904c9d8-kube-api-access-smttp\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.603166 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.603178 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.603186 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.633787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-config-data" (OuterVolumeSpecName: "config-data") pod "b240084f-7a9d-4dca-89a2-097e9904c9d8" (UID: "b240084f-7a9d-4dca-89a2-097e9904c9d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:28 crc kubenswrapper[4790]: I1003 09:00:28.705772 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b240084f-7a9d-4dca-89a2-097e9904c9d8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.179905 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.179920 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b240084f-7a9d-4dca-89a2-097e9904c9d8","Type":"ContainerDied","Data":"f0cb2a0bc89ba7208436edb795d32f60f7db32722c1dd8ae4d49479e79cab334"} Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.179991 4790 scope.go:117] "RemoveContainer" containerID="0900704de45ab7ab971162b5bd26d7cdc98dd24f14d425080eb1d39ba170f191" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.182234 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tzhvj" event={"ID":"42766976-3706-48c2-b456-1eef4f3d1303","Type":"ContainerStarted","Data":"0305656b4c1094354b0983f64912be6419b8c3b1db2abe014347fc0bc66115fb"} Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.208200 4790 scope.go:117] "RemoveContainer" containerID="4b5db09faa3aba2f47ac76ad755fd446f439e2dbcd278e953321395ca15d9504" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.217735 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-tzhvj" podStartSLOduration=2.716996173 podStartE2EDuration="11.217692911s" podCreationTimestamp="2025-10-03 09:00:18 +0000 UTC" firstStartedPulling="2025-10-03 09:00:19.581090764 +0000 UTC m=+1205.934025002" lastFinishedPulling="2025-10-03 09:00:28.081787502 +0000 UTC m=+1214.434721740" observedRunningTime="2025-10-03 09:00:29.217189538 +0000 UTC m=+1215.570123786" watchObservedRunningTime="2025-10-03 09:00:29.217692911 +0000 UTC m=+1215.570627149" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.250774 4790 scope.go:117] "RemoveContainer" containerID="6e35397ceee89337e761cce23007eac86e8710600413274aa29eb15d9a6fe140" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.260655 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 
09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.313742 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.354739 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:00:29 crc kubenswrapper[4790]: E1003 09:00:29.355258 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="ceilometer-notification-agent" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.355274 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="ceilometer-notification-agent" Oct 03 09:00:29 crc kubenswrapper[4790]: E1003 09:00:29.355330 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="proxy-httpd" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.355339 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="proxy-httpd" Oct 03 09:00:29 crc kubenswrapper[4790]: E1003 09:00:29.355360 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="ceilometer-central-agent" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.355368 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="ceilometer-central-agent" Oct 03 09:00:29 crc kubenswrapper[4790]: E1003 09:00:29.355387 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="sg-core" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.355394 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="sg-core" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.355633 4790 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="ceilometer-notification-agent" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.355656 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="sg-core" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.355669 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="ceilometer-central-agent" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.355682 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" containerName="proxy-httpd" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.357744 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.382780 4790 scope.go:117] "RemoveContainer" containerID="cd1266527775f4ca99436e80c5cbf44aeccc122b210d128117301265eb46d58e" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.383391 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.383649 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.388698 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.518113 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-config-data\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.518663 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28gh2\" (UniqueName: \"kubernetes.io/projected/80973b39-7225-498d-9bf5-12a6b7d4f3dc-kube-api-access-28gh2\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.518834 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-scripts\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.518873 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-run-httpd\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.518923 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-log-httpd\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.518996 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.519083 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.620428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.620533 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-config-data\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.620559 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28gh2\" (UniqueName: \"kubernetes.io/projected/80973b39-7225-498d-9bf5-12a6b7d4f3dc-kube-api-access-28gh2\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.620623 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-scripts\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.620641 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-run-httpd\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 
09:00:29.620676 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-log-httpd\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.620708 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.621403 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-run-httpd\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.621485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-log-httpd\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.625290 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.625718 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-config-data\") pod \"ceilometer-0\" (UID: 
\"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.626472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-scripts\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.630594 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.639769 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28gh2\" (UniqueName: \"kubernetes.io/projected/80973b39-7225-498d-9bf5-12a6b7d4f3dc-kube-api-access-28gh2\") pod \"ceilometer-0\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " pod="openstack/ceilometer-0" Oct 03 09:00:29 crc kubenswrapper[4790]: I1003 09:00:29.702145 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:00:30 crc kubenswrapper[4790]: I1003 09:00:30.165190 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:00:30 crc kubenswrapper[4790]: I1003 09:00:30.198859 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80973b39-7225-498d-9bf5-12a6b7d4f3dc","Type":"ContainerStarted","Data":"1f83e9d32b08f17ab0f8fee683988d4d7f9aeddfb5450f3610e57fde4c832a3a"} Oct 03 09:00:30 crc kubenswrapper[4790]: I1003 09:00:30.342035 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b240084f-7a9d-4dca-89a2-097e9904c9d8" path="/var/lib/kubelet/pods/b240084f-7a9d-4dca-89a2-097e9904c9d8/volumes" Oct 03 09:00:31 crc kubenswrapper[4790]: I1003 09:00:31.211985 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80973b39-7225-498d-9bf5-12a6b7d4f3dc","Type":"ContainerStarted","Data":"6458bb4a693bb122671c72474f7f5ba3d390676b6a0b20c111150d5d3a3dac6a"} Oct 03 09:00:31 crc kubenswrapper[4790]: I1003 09:00:31.212241 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80973b39-7225-498d-9bf5-12a6b7d4f3dc","Type":"ContainerStarted","Data":"78c2d1368c85c5a888003bc1bca204a67c3948b5750dd423aab901655e8b693b"} Oct 03 09:00:32 crc kubenswrapper[4790]: I1003 09:00:32.235601 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80973b39-7225-498d-9bf5-12a6b7d4f3dc","Type":"ContainerStarted","Data":"2994d0f1b20da31133454d8d7621c18e7eaf60baecb000eb5f27958614e19bec"} Oct 03 09:00:33 crc kubenswrapper[4790]: I1003 09:00:33.248454 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80973b39-7225-498d-9bf5-12a6b7d4f3dc","Type":"ContainerStarted","Data":"893fd81d194bae337b26a4de650ee7402cf6546a76510b16713b2fa91e4add13"} Oct 03 09:00:33 crc kubenswrapper[4790]: I1003 
09:00:33.248738 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 09:00:33 crc kubenswrapper[4790]: I1003 09:00:33.277262 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.645309885 podStartE2EDuration="4.277241976s" podCreationTimestamp="2025-10-03 09:00:29 +0000 UTC" firstStartedPulling="2025-10-03 09:00:30.170589216 +0000 UTC m=+1216.523523454" lastFinishedPulling="2025-10-03 09:00:32.802521307 +0000 UTC m=+1219.155455545" observedRunningTime="2025-10-03 09:00:33.272705663 +0000 UTC m=+1219.625639921" watchObservedRunningTime="2025-10-03 09:00:33.277241976 +0000 UTC m=+1219.630176214" Oct 03 09:00:42 crc kubenswrapper[4790]: I1003 09:00:42.342653 4790 generic.go:334] "Generic (PLEG): container finished" podID="42766976-3706-48c2-b456-1eef4f3d1303" containerID="0305656b4c1094354b0983f64912be6419b8c3b1db2abe014347fc0bc66115fb" exitCode=0 Oct 03 09:00:42 crc kubenswrapper[4790]: I1003 09:00:42.342772 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tzhvj" event={"ID":"42766976-3706-48c2-b456-1eef4f3d1303","Type":"ContainerDied","Data":"0305656b4c1094354b0983f64912be6419b8c3b1db2abe014347fc0bc66115fb"} Oct 03 09:00:43 crc kubenswrapper[4790]: I1003 09:00:43.744165 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:43 crc kubenswrapper[4790]: I1003 09:00:43.900537 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-config-data\") pod \"42766976-3706-48c2-b456-1eef4f3d1303\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " Oct 03 09:00:43 crc kubenswrapper[4790]: I1003 09:00:43.900596 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-combined-ca-bundle\") pod \"42766976-3706-48c2-b456-1eef4f3d1303\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " Oct 03 09:00:43 crc kubenswrapper[4790]: I1003 09:00:43.900616 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b55tj\" (UniqueName: \"kubernetes.io/projected/42766976-3706-48c2-b456-1eef4f3d1303-kube-api-access-b55tj\") pod \"42766976-3706-48c2-b456-1eef4f3d1303\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " Oct 03 09:00:43 crc kubenswrapper[4790]: I1003 09:00:43.900660 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-scripts\") pod \"42766976-3706-48c2-b456-1eef4f3d1303\" (UID: \"42766976-3706-48c2-b456-1eef4f3d1303\") " Oct 03 09:00:43 crc kubenswrapper[4790]: I1003 09:00:43.905745 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42766976-3706-48c2-b456-1eef4f3d1303-kube-api-access-b55tj" (OuterVolumeSpecName: "kube-api-access-b55tj") pod "42766976-3706-48c2-b456-1eef4f3d1303" (UID: "42766976-3706-48c2-b456-1eef4f3d1303"). InnerVolumeSpecName "kube-api-access-b55tj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:00:43 crc kubenswrapper[4790]: I1003 09:00:43.906383 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-scripts" (OuterVolumeSpecName: "scripts") pod "42766976-3706-48c2-b456-1eef4f3d1303" (UID: "42766976-3706-48c2-b456-1eef4f3d1303"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:43 crc kubenswrapper[4790]: I1003 09:00:43.927737 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42766976-3706-48c2-b456-1eef4f3d1303" (UID: "42766976-3706-48c2-b456-1eef4f3d1303"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:43 crc kubenswrapper[4790]: I1003 09:00:43.929771 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-config-data" (OuterVolumeSpecName: "config-data") pod "42766976-3706-48c2-b456-1eef4f3d1303" (UID: "42766976-3706-48c2-b456-1eef4f3d1303"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.003030 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.003078 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.003119 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b55tj\" (UniqueName: \"kubernetes.io/projected/42766976-3706-48c2-b456-1eef4f3d1303-kube-api-access-b55tj\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.003129 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42766976-3706-48c2-b456-1eef4f3d1303-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.367353 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tzhvj" event={"ID":"42766976-3706-48c2-b456-1eef4f3d1303","Type":"ContainerDied","Data":"83f733a8b445a67799b91805bb7fbdc8f8697369422c22cead87a2aec2e9cf8a"} Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.367403 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tzhvj" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.367415 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83f733a8b445a67799b91805bb7fbdc8f8697369422c22cead87a2aec2e9cf8a" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.463528 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 09:00:44 crc kubenswrapper[4790]: E1003 09:00:44.464002 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42766976-3706-48c2-b456-1eef4f3d1303" containerName="nova-cell0-conductor-db-sync" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.464024 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="42766976-3706-48c2-b456-1eef4f3d1303" containerName="nova-cell0-conductor-db-sync" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.464208 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="42766976-3706-48c2-b456-1eef4f3d1303" containerName="nova-cell0-conductor-db-sync" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.465047 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.467534 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-mtdp6" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.468021 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.473175 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.613358 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6stp2\" (UniqueName: \"kubernetes.io/projected/46dabafe-d4c6-4b3e-84f6-3eb83eee9647-kube-api-access-6stp2\") pod \"nova-cell0-conductor-0\" (UID: \"46dabafe-d4c6-4b3e-84f6-3eb83eee9647\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.613426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dabafe-d4c6-4b3e-84f6-3eb83eee9647-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"46dabafe-d4c6-4b3e-84f6-3eb83eee9647\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.613462 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46dabafe-d4c6-4b3e-84f6-3eb83eee9647-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"46dabafe-d4c6-4b3e-84f6-3eb83eee9647\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.715400 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6stp2\" (UniqueName: 
\"kubernetes.io/projected/46dabafe-d4c6-4b3e-84f6-3eb83eee9647-kube-api-access-6stp2\") pod \"nova-cell0-conductor-0\" (UID: \"46dabafe-d4c6-4b3e-84f6-3eb83eee9647\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.715493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dabafe-d4c6-4b3e-84f6-3eb83eee9647-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"46dabafe-d4c6-4b3e-84f6-3eb83eee9647\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.715530 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46dabafe-d4c6-4b3e-84f6-3eb83eee9647-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"46dabafe-d4c6-4b3e-84f6-3eb83eee9647\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.720200 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46dabafe-d4c6-4b3e-84f6-3eb83eee9647-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"46dabafe-d4c6-4b3e-84f6-3eb83eee9647\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.720208 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46dabafe-d4c6-4b3e-84f6-3eb83eee9647-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"46dabafe-d4c6-4b3e-84f6-3eb83eee9647\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.734684 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6stp2\" (UniqueName: \"kubernetes.io/projected/46dabafe-d4c6-4b3e-84f6-3eb83eee9647-kube-api-access-6stp2\") pod \"nova-cell0-conductor-0\" (UID: 
\"46dabafe-d4c6-4b3e-84f6-3eb83eee9647\") " pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:44 crc kubenswrapper[4790]: I1003 09:00:44.786330 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:45 crc kubenswrapper[4790]: I1003 09:00:45.266895 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 03 09:00:45 crc kubenswrapper[4790]: W1003 09:00:45.269430 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dabafe_d4c6_4b3e_84f6_3eb83eee9647.slice/crio-2b0c354546d8cac4ba0ebc292c7109c307fe8a5c3d36d49194ae242c6e88e9ac WatchSource:0}: Error finding container 2b0c354546d8cac4ba0ebc292c7109c307fe8a5c3d36d49194ae242c6e88e9ac: Status 404 returned error can't find the container with id 2b0c354546d8cac4ba0ebc292c7109c307fe8a5c3d36d49194ae242c6e88e9ac Oct 03 09:00:45 crc kubenswrapper[4790]: I1003 09:00:45.401605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"46dabafe-d4c6-4b3e-84f6-3eb83eee9647","Type":"ContainerStarted","Data":"2b0c354546d8cac4ba0ebc292c7109c307fe8a5c3d36d49194ae242c6e88e9ac"} Oct 03 09:00:46 crc kubenswrapper[4790]: I1003 09:00:46.411801 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"46dabafe-d4c6-4b3e-84f6-3eb83eee9647","Type":"ContainerStarted","Data":"5b46fca477d878354b77e8821cf8ed795a17bbfe94b055f2d020d9ef8a4a5fe3"} Oct 03 09:00:46 crc kubenswrapper[4790]: I1003 09:00:46.412250 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:46 crc kubenswrapper[4790]: I1003 09:00:46.436258 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.436211008 podStartE2EDuration="2.436211008s" 
podCreationTimestamp="2025-10-03 09:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:46.425272468 +0000 UTC m=+1232.778206706" watchObservedRunningTime="2025-10-03 09:00:46.436211008 +0000 UTC m=+1232.789145246" Oct 03 09:00:54 crc kubenswrapper[4790]: I1003 09:00:54.824510 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.267533 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-4z5vg"] Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.270497 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4z5vg" Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.273534 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.274107 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.281749 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4z5vg"] Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.435633 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.437332 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.444989 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.445738 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-scripts\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.445823 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-config-data\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.445884 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.446049 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m4bc\" (UniqueName: \"kubernetes.io/projected/62abaf76-b94a-430c-932f-75164034006a-kube-api-access-5m4bc\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.453811 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.535648 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.537466 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.539876 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.547860 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.547944 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-scripts\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.547989 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlwrc\" (UniqueName: \"kubernetes.io/projected/43f3baeb-3387-4187-97b5-6bced61686f8-kube-api-access-tlwrc\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.548018 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-config-data\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.548053 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-config-data\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.548096 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.548118 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f3baeb-3387-4187-97b5-6bced61686f8-logs\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.548219 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m4bc\" (UniqueName: \"kubernetes.io/projected/62abaf76-b94a-430c-932f-75164034006a-kube-api-access-5m4bc\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.557562 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.567953 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.569232 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m4bc\" (UniqueName: \"kubernetes.io/projected/62abaf76-b94a-430c-932f-75164034006a-kube-api-access-5m4bc\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.573501 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-config-data\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.593223 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-scripts\") pod \"nova-cell0-cell-mapping-4z5vg\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.593766 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4z5vg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.659862 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-config-data\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.659922 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.659982 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9504fc0-4063-48bf-87a3-f5e75deba42f-logs\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.660002 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlwrc\" (UniqueName: \"kubernetes.io/projected/43f3baeb-3387-4187-97b5-6bced61686f8-kube-api-access-tlwrc\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.660029 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.660054 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-config-data\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.660088 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f3baeb-3387-4187-97b5-6bced61686f8-logs\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.660121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5ssq\" (UniqueName: \"kubernetes.io/projected/e9504fc0-4063-48bf-87a3-f5e75deba42f-kube-api-access-n5ssq\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.663956 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f3baeb-3387-4187-97b5-6bced61686f8-logs\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.672380 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.683453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-config-data\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.687239 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.688587 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.699674 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.709745 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.710982 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.716509 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlwrc\" (UniqueName: \"kubernetes.io/projected/43f3baeb-3387-4187-97b5-6bced61686f8-kube-api-access-tlwrc\") pod \"nova-api-0\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.726940 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.727217 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.736219 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.747882 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7859b488d7-9flsg"]
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.749677 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.759187 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.761423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9504fc0-4063-48bf-87a3-f5e75deba42f-logs\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.761473 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.761529 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5ssq\" (UniqueName: \"kubernetes.io/projected/e9504fc0-4063-48bf-87a3-f5e75deba42f-kube-api-access-n5ssq\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.762126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9504fc0-4063-48bf-87a3-f5e75deba42f-logs\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.767096 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-config-data\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.773491 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859b488d7-9flsg"]
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.787384 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.790007 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-config-data\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.795004 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5ssq\" (UniqueName: \"kubernetes.io/projected/e9504fc0-4063-48bf-87a3-f5e75deba42f-kube-api-access-n5ssq\") pod \"nova-metadata-0\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.868848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kchpd\" (UniqueName: \"kubernetes.io/projected/ac5587ac-3a33-40b8-b068-bbfee9cb702b-kube-api-access-kchpd\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869217 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-config-data\") pod \"nova-scheduler-0\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " pod="openstack/nova-scheduler-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869252 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqv4j\" (UniqueName: \"kubernetes.io/projected/17497332-4b10-4784-9052-b310611d9c0a-kube-api-access-cqv4j\") pod \"nova-cell1-novncproxy-0\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869304 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " pod="openstack/nova-scheduler-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869332 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-config\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869433 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-sb\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869487 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-swift-storage-0\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869522 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrl5t\" (UniqueName: \"kubernetes.io/projected/40eba639-5cf7-43e9-a24a-ed0027de7815-kube-api-access-mrl5t\") pod \"nova-scheduler-0\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " pod="openstack/nova-scheduler-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869612 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-svc\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.869633 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-nb\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.890353 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.970932 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-config-data\") pod \"nova-scheduler-0\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " pod="openstack/nova-scheduler-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.970991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqv4j\" (UniqueName: \"kubernetes.io/projected/17497332-4b10-4784-9052-b310611d9c0a-kube-api-access-cqv4j\") pod \"nova-cell1-novncproxy-0\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.971037 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.971071 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " pod="openstack/nova-scheduler-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.971099 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-config\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.971124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-sb\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.971143 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-swift-storage-0\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.971162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrl5t\" (UniqueName: \"kubernetes.io/projected/40eba639-5cf7-43e9-a24a-ed0027de7815-kube-api-access-mrl5t\") pod \"nova-scheduler-0\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " pod="openstack/nova-scheduler-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.971190 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-svc\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.971206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-nb\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.971230 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kchpd\" (UniqueName: \"kubernetes.io/projected/ac5587ac-3a33-40b8-b068-bbfee9cb702b-kube-api-access-kchpd\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.971270 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.973634 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-sb\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.975731 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-swift-storage-0\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.977065 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-svc\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.977268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-nb\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.987376 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-config\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.990461 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " pod="openstack/nova-scheduler-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.991057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.992174 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-config-data\") pod \"nova-scheduler-0\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " pod="openstack/nova-scheduler-0"
Oct 03 09:00:55 crc kubenswrapper[4790]: I1003 09:00:55.992976 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.003889 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqv4j\" (UniqueName: \"kubernetes.io/projected/17497332-4b10-4784-9052-b310611d9c0a-kube-api-access-cqv4j\") pod \"nova-cell1-novncproxy-0\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.008992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kchpd\" (UniqueName: \"kubernetes.io/projected/ac5587ac-3a33-40b8-b068-bbfee9cb702b-kube-api-access-kchpd\") pod \"dnsmasq-dns-7859b488d7-9flsg\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.030276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrl5t\" (UniqueName: \"kubernetes.io/projected/40eba639-5cf7-43e9-a24a-ed0027de7815-kube-api-access-mrl5t\") pod \"nova-scheduler-0\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " pod="openstack/nova-scheduler-0"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.151552 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.162600 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.178635 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859b488d7-9flsg"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.299048 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-4z5vg"]
Oct 03 09:00:56 crc kubenswrapper[4790]: W1003 09:00:56.323259 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62abaf76_b94a_430c_932f_75164034006a.slice/crio-97ac07f74773a15bf70301819e8094a8cb29e87220356ff565620b7f296aeeed WatchSource:0}: Error finding container 97ac07f74773a15bf70301819e8094a8cb29e87220356ff565620b7f296aeeed: Status 404 returned error can't find the container with id 97ac07f74773a15bf70301819e8094a8cb29e87220356ff565620b7f296aeeed
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.444443 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 09:00:56 crc kubenswrapper[4790]: W1003 09:00:56.488212 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43f3baeb_3387_4187_97b5_6bced61686f8.slice/crio-3526166fa4fab27aef3edccb4cde482b2b8f0c2cbaa1a5296a292fd1ee07b07d WatchSource:0}: Error finding container 3526166fa4fab27aef3edccb4cde482b2b8f0c2cbaa1a5296a292fd1ee07b07d: Status 404 returned error can't find the container with id 3526166fa4fab27aef3edccb4cde482b2b8f0c2cbaa1a5296a292fd1ee07b07d
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.558557 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.602976 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43f3baeb-3387-4187-97b5-6bced61686f8","Type":"ContainerStarted","Data":"3526166fa4fab27aef3edccb4cde482b2b8f0c2cbaa1a5296a292fd1ee07b07d"}
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.604348 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4z5vg" event={"ID":"62abaf76-b94a-430c-932f-75164034006a","Type":"ContainerStarted","Data":"97ac07f74773a15bf70301819e8094a8cb29e87220356ff565620b7f296aeeed"}
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.607787 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vhvvz"]
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.615176 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vhvvz"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.618382 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.619145 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.662736 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vhvvz"]
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.690226 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-scripts\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.690286 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kqfc\" (UniqueName: \"kubernetes.io/projected/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-kube-api-access-9kqfc\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.690449 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-config-data\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.690472 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.792296 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-config-data\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.792350 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz"
Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.792455 4790 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-scripts\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz" Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.792488 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kqfc\" (UniqueName: \"kubernetes.io/projected/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-kube-api-access-9kqfc\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz" Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.806534 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz" Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.811051 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-scripts\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz" Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.820952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kqfc\" (UniqueName: \"kubernetes.io/projected/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-kube-api-access-9kqfc\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz" Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.825419 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-config-data\") pod \"nova-cell1-conductor-db-sync-vhvvz\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " pod="openstack/nova-cell1-conductor-db-sync-vhvvz" Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.930770 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859b488d7-9flsg"] Oct 03 09:00:56 crc kubenswrapper[4790]: I1003 09:00:56.947334 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:00:57 crc kubenswrapper[4790]: I1003 09:00:57.018909 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vhvvz" Oct 03 09:00:57 crc kubenswrapper[4790]: I1003 09:00:57.164122 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:00:57 crc kubenswrapper[4790]: W1003 09:00:57.191881 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40eba639_5cf7_43e9_a24a_ed0027de7815.slice/crio-5ef62d6f196903bc469127cce486ea1dc905ed230e76b5a12d5ad5b5d795b96b WatchSource:0}: Error finding container 5ef62d6f196903bc469127cce486ea1dc905ed230e76b5a12d5ad5b5d795b96b: Status 404 returned error can't find the container with id 5ef62d6f196903bc469127cce486ea1dc905ed230e76b5a12d5ad5b5d795b96b Oct 03 09:00:57 crc kubenswrapper[4790]: I1003 09:00:57.628516 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40eba639-5cf7-43e9-a24a-ed0027de7815","Type":"ContainerStarted","Data":"5ef62d6f196903bc469127cce486ea1dc905ed230e76b5a12d5ad5b5d795b96b"} Oct 03 09:00:57 crc kubenswrapper[4790]: I1003 09:00:57.632790 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e9504fc0-4063-48bf-87a3-f5e75deba42f","Type":"ContainerStarted","Data":"e3a3d7fabaf421f080f8b7b749a5e04b086add7d5f809ec91647bae1123ec07b"} Oct 03 09:00:57 crc kubenswrapper[4790]: I1003 09:00:57.635802 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vhvvz"] Oct 03 09:00:57 crc kubenswrapper[4790]: I1003 09:00:57.640430 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"17497332-4b10-4784-9052-b310611d9c0a","Type":"ContainerStarted","Data":"c8470b883c2a6f444f8bd680b2ed161ba12789d3815860a07e6daf6ccc290d88"} Oct 03 09:00:57 crc kubenswrapper[4790]: I1003 09:00:57.644612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" event={"ID":"ac5587ac-3a33-40b8-b068-bbfee9cb702b","Type":"ContainerStarted","Data":"b97964d83ecfd2cc23becb7229b3a327bd0cd183d18069d6578472649c370e17"} Oct 03 09:00:57 crc kubenswrapper[4790]: I1003 09:00:57.644639 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" event={"ID":"ac5587ac-3a33-40b8-b068-bbfee9cb702b","Type":"ContainerStarted","Data":"cd16f9894b502ba18df3f7209cf6f8adaacadae989d1010144da452f1076fe80"} Oct 03 09:00:57 crc kubenswrapper[4790]: I1003 09:00:57.649532 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4z5vg" event={"ID":"62abaf76-b94a-430c-932f-75164034006a","Type":"ContainerStarted","Data":"c3fb7da91946b3152ddae2cd5e951b930c2d505ffcb9a61ef1bdc5685a8c21da"} Oct 03 09:00:57 crc kubenswrapper[4790]: I1003 09:00:57.685848 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-4z5vg" podStartSLOduration=2.685830282 podStartE2EDuration="2.685830282s" podCreationTimestamp="2025-10-03 09:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-03 09:00:57.682012327 +0000 UTC m=+1244.034946565" watchObservedRunningTime="2025-10-03 09:00:57.685830282 +0000 UTC m=+1244.038764520" Oct 03 09:00:58 crc kubenswrapper[4790]: I1003 09:00:58.665002 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vhvvz" event={"ID":"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc","Type":"ContainerStarted","Data":"6cdc7e4e2579294bae2eea362c1ec158e112005ba8b149ba23fea5acc8e1c523"} Oct 03 09:00:58 crc kubenswrapper[4790]: I1003 09:00:58.666488 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vhvvz" event={"ID":"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc","Type":"ContainerStarted","Data":"1fa7c036124f1ba2e40d222e1d6d73c93f73cec2a2f174b51d3fc595305f8994"} Oct 03 09:00:58 crc kubenswrapper[4790]: I1003 09:00:58.671034 4790 generic.go:334] "Generic (PLEG): container finished" podID="ac5587ac-3a33-40b8-b068-bbfee9cb702b" containerID="b97964d83ecfd2cc23becb7229b3a327bd0cd183d18069d6578472649c370e17" exitCode=0 Oct 03 09:00:58 crc kubenswrapper[4790]: I1003 09:00:58.671817 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" event={"ID":"ac5587ac-3a33-40b8-b068-bbfee9cb702b","Type":"ContainerDied","Data":"b97964d83ecfd2cc23becb7229b3a327bd0cd183d18069d6578472649c370e17"} Oct 03 09:00:58 crc kubenswrapper[4790]: I1003 09:00:58.671875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" event={"ID":"ac5587ac-3a33-40b8-b068-bbfee9cb702b","Type":"ContainerStarted","Data":"2ce6ad919c4abde1bd65f5a01d210a624389d2c7552ea3c4d04d6f1762fbca6d"} Oct 03 09:00:58 crc kubenswrapper[4790]: I1003 09:00:58.671994 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" Oct 03 09:00:58 crc kubenswrapper[4790]: I1003 09:00:58.687674 4790 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell1-conductor-db-sync-vhvvz" podStartSLOduration=2.687654618 podStartE2EDuration="2.687654618s" podCreationTimestamp="2025-10-03 09:00:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:58.686783684 +0000 UTC m=+1245.039717932" watchObservedRunningTime="2025-10-03 09:00:58.687654618 +0000 UTC m=+1245.040588856" Oct 03 09:00:58 crc kubenswrapper[4790]: I1003 09:00:58.717511 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" podStartSLOduration=3.717489906 podStartE2EDuration="3.717489906s" podCreationTimestamp="2025-10-03 09:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:00:58.706996038 +0000 UTC m=+1245.059930266" watchObservedRunningTime="2025-10-03 09:00:58.717489906 +0000 UTC m=+1245.070424144" Oct 03 09:00:59 crc kubenswrapper[4790]: I1003 09:00:59.391945 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:00:59 crc kubenswrapper[4790]: I1003 09:00:59.405349 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:00:59 crc kubenswrapper[4790]: I1003 09:00:59.976603 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.144668 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29324701-xvbcm"] Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.145909 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.163225 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29324701-xvbcm"] Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.306099 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-config-data\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.306168 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7222\" (UniqueName: \"kubernetes.io/projected/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-kube-api-access-z7222\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.306205 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-fernet-keys\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.306256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-combined-ca-bundle\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.407729 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-config-data\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.408126 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7222\" (UniqueName: \"kubernetes.io/projected/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-kube-api-access-z7222\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.408163 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-fernet-keys\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.408249 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-combined-ca-bundle\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.418508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-config-data\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.419608 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-combined-ca-bundle\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.434416 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7222\" (UniqueName: \"kubernetes.io/projected/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-kube-api-access-z7222\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.436860 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-fernet-keys\") pod \"keystone-cron-29324701-xvbcm\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:00 crc kubenswrapper[4790]: I1003 09:01:00.465245 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:03 crc kubenswrapper[4790]: I1003 09:01:03.815938 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:01:03 crc kubenswrapper[4790]: I1003 09:01:03.816704 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a93cbe46-9c4b-49b9-916f-b26ed8c80fe9" containerName="kube-state-metrics" containerID="cri-o://1c1a53cc61127423e411317bf7e6b8b910d0c813b5d7526ab3073aa3cd1b416e" gracePeriod=30 Oct 03 09:01:04 crc kubenswrapper[4790]: I1003 09:01:04.756464 4790 generic.go:334] "Generic (PLEG): container finished" podID="a93cbe46-9c4b-49b9-916f-b26ed8c80fe9" containerID="1c1a53cc61127423e411317bf7e6b8b910d0c813b5d7526ab3073aa3cd1b416e" exitCode=2 Oct 03 09:01:04 crc kubenswrapper[4790]: I1003 09:01:04.757186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a93cbe46-9c4b-49b9-916f-b26ed8c80fe9","Type":"ContainerDied","Data":"1c1a53cc61127423e411317bf7e6b8b910d0c813b5d7526ab3073aa3cd1b416e"} Oct 03 09:01:04 crc kubenswrapper[4790]: I1003 09:01:04.797550 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:01:04 crc kubenswrapper[4790]: I1003 09:01:04.907619 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frf8p\" (UniqueName: \"kubernetes.io/projected/a93cbe46-9c4b-49b9-916f-b26ed8c80fe9-kube-api-access-frf8p\") pod \"a93cbe46-9c4b-49b9-916f-b26ed8c80fe9\" (UID: \"a93cbe46-9c4b-49b9-916f-b26ed8c80fe9\") " Oct 03 09:01:04 crc kubenswrapper[4790]: I1003 09:01:04.915360 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a93cbe46-9c4b-49b9-916f-b26ed8c80fe9-kube-api-access-frf8p" (OuterVolumeSpecName: "kube-api-access-frf8p") pod "a93cbe46-9c4b-49b9-916f-b26ed8c80fe9" (UID: "a93cbe46-9c4b-49b9-916f-b26ed8c80fe9"). InnerVolumeSpecName "kube-api-access-frf8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:04 crc kubenswrapper[4790]: I1003 09:01:04.938345 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29324701-xvbcm"] Oct 03 09:01:04 crc kubenswrapper[4790]: W1003 09:01:04.946888 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0eab3760_3551_42a8_9eaa_f53d4b94a8ca.slice/crio-b442af1de11863dfdd5bed757d1323674d03aa01951fdf370d84295b264e597a WatchSource:0}: Error finding container b442af1de11863dfdd5bed757d1323674d03aa01951fdf370d84295b264e597a: Status 404 returned error can't find the container with id b442af1de11863dfdd5bed757d1323674d03aa01951fdf370d84295b264e597a Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.014057 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frf8p\" (UniqueName: \"kubernetes.io/projected/a93cbe46-9c4b-49b9-916f-b26ed8c80fe9-kube-api-access-frf8p\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.769551 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"43f3baeb-3387-4187-97b5-6bced61686f8","Type":"ContainerStarted","Data":"54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae"} Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.769952 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43f3baeb-3387-4187-97b5-6bced61686f8","Type":"ContainerStarted","Data":"09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e"} Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.771660 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40eba639-5cf7-43e9-a24a-ed0027de7815","Type":"ContainerStarted","Data":"dee59fba0543ec8d8516375c3df52bfe9f8c7532838a62f064236c0d359c967e"} Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.773714 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9504fc0-4063-48bf-87a3-f5e75deba42f","Type":"ContainerStarted","Data":"b3c7fadbe8c3098ca776e6142c6fb85e9f18c3076604286e7dbcc5c1c15cc8c7"} Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.773754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9504fc0-4063-48bf-87a3-f5e75deba42f","Type":"ContainerStarted","Data":"8e837c52774af4ef3334da1340eb30cba3810dfe1ad01ff69c870840019e5bc1"} Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.773778 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e9504fc0-4063-48bf-87a3-f5e75deba42f" containerName="nova-metadata-log" containerID="cri-o://8e837c52774af4ef3334da1340eb30cba3810dfe1ad01ff69c870840019e5bc1" gracePeriod=30 Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.773835 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e9504fc0-4063-48bf-87a3-f5e75deba42f" containerName="nova-metadata-metadata" 
containerID="cri-o://b3c7fadbe8c3098ca776e6142c6fb85e9f18c3076604286e7dbcc5c1c15cc8c7" gracePeriod=30 Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.775939 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a93cbe46-9c4b-49b9-916f-b26ed8c80fe9","Type":"ContainerDied","Data":"6bf9ae8c2995bec5c9ed445d0c82a475195feee6453695ddc6d967018851ec9b"} Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.775980 4790 scope.go:117] "RemoveContainer" containerID="1c1a53cc61127423e411317bf7e6b8b910d0c813b5d7526ab3073aa3cd1b416e" Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.775984 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.785060 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-xvbcm" event={"ID":"0eab3760-3551-42a8-9eaa-f53d4b94a8ca","Type":"ContainerStarted","Data":"7db553d5fffb26bff30e236f927e2cb44a7155f07b6763d801d64ead489e1684"} Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.785248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-xvbcm" event={"ID":"0eab3760-3551-42a8-9eaa-f53d4b94a8ca","Type":"ContainerStarted","Data":"b442af1de11863dfdd5bed757d1323674d03aa01951fdf370d84295b264e597a"} Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.801193 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"17497332-4b10-4784-9052-b310611d9c0a","Type":"ContainerStarted","Data":"c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38"} Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.801897 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="17497332-4b10-4784-9052-b310611d9c0a" containerName="nova-cell1-novncproxy-novncproxy" 
containerID="cri-o://c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38" gracePeriod=30 Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.817001 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.902722046 podStartE2EDuration="10.816972892s" podCreationTimestamp="2025-10-03 09:00:55 +0000 UTC" firstStartedPulling="2025-10-03 09:00:56.491970133 +0000 UTC m=+1242.844904371" lastFinishedPulling="2025-10-03 09:01:04.406220979 +0000 UTC m=+1250.759155217" observedRunningTime="2025-10-03 09:01:05.803723419 +0000 UTC m=+1252.156657677" watchObservedRunningTime="2025-10-03 09:01:05.816972892 +0000 UTC m=+1252.169907150" Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.872494 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.684091061 podStartE2EDuration="10.872469193s" podCreationTimestamp="2025-10-03 09:00:55 +0000 UTC" firstStartedPulling="2025-10-03 09:00:57.196825371 +0000 UTC m=+1243.549759609" lastFinishedPulling="2025-10-03 09:01:04.385203503 +0000 UTC m=+1250.738137741" observedRunningTime="2025-10-03 09:01:05.831831029 +0000 UTC m=+1252.184765267" watchObservedRunningTime="2025-10-03 09:01:05.872469193 +0000 UTC m=+1252.225403441" Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.891816 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.891939 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.919780 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.111734253 podStartE2EDuration="10.919758199s" podCreationTimestamp="2025-10-03 09:00:55 +0000 UTC" firstStartedPulling="2025-10-03 09:00:56.579516572 +0000 
UTC m=+1242.932450810" lastFinishedPulling="2025-10-03 09:01:04.387540518 +0000 UTC m=+1250.740474756" observedRunningTime="2025-10-03 09:01:05.856270559 +0000 UTC m=+1252.209204817" watchObservedRunningTime="2025-10-03 09:01:05.919758199 +0000 UTC m=+1252.272692437" Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.931120 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29324701-xvbcm" podStartSLOduration=5.93109583 podStartE2EDuration="5.93109583s" podCreationTimestamp="2025-10-03 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:05.873409489 +0000 UTC m=+1252.226343747" watchObservedRunningTime="2025-10-03 09:01:05.93109583 +0000 UTC m=+1252.284030058" Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.943746 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.519181611 podStartE2EDuration="10.943727706s" podCreationTimestamp="2025-10-03 09:00:55 +0000 UTC" firstStartedPulling="2025-10-03 09:00:56.975219757 +0000 UTC m=+1243.328153995" lastFinishedPulling="2025-10-03 09:01:04.399765852 +0000 UTC m=+1250.752700090" observedRunningTime="2025-10-03 09:01:05.890051255 +0000 UTC m=+1252.242985503" watchObservedRunningTime="2025-10-03 09:01:05.943727706 +0000 UTC m=+1252.296661944" Oct 03 09:01:05 crc kubenswrapper[4790]: I1003 09:01:05.995048 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.007669 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.030434 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:01:06 crc kubenswrapper[4790]: E1003 09:01:06.031308 4790 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a93cbe46-9c4b-49b9-916f-b26ed8c80fe9" containerName="kube-state-metrics" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.031398 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a93cbe46-9c4b-49b9-916f-b26ed8c80fe9" containerName="kube-state-metrics" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.031737 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a93cbe46-9c4b-49b9-916f-b26ed8c80fe9" containerName="kube-state-metrics" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.032666 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.038186 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.038912 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.041111 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.144108 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.144215 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " 
pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.144542 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.144601 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwkz4\" (UniqueName: \"kubernetes.io/projected/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-kube-api-access-vwkz4\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.152805 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.153009 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.164339 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.180798 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.197540 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.246878 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.246929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwkz4\" (UniqueName: \"kubernetes.io/projected/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-kube-api-access-vwkz4\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.247086 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.247156 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.259656 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.263776 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.278424 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.303204 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cd99db9f-qwm9f"] Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.303519 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" podUID="5d1475d1-40ee-4daf-a4ad-f52488205cc0" containerName="dnsmasq-dns" containerID="cri-o://b690c39b3b3f4a00e09558c8151b25a3c526c946ee83f9dd9530853ac267bfdb" gracePeriod=10 Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.313420 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwkz4\" (UniqueName: \"kubernetes.io/projected/272a9be6-ce74-4ee6-a1d2-17a8935d8c2e-kube-api-access-vwkz4\") pod \"kube-state-metrics-0\" (UID: \"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e\") " pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.364419 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.372907 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a93cbe46-9c4b-49b9-916f-b26ed8c80fe9" path="/var/lib/kubelet/pods/a93cbe46-9c4b-49b9-916f-b26ed8c80fe9/volumes" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.678116 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.678430 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="ceilometer-central-agent" containerID="cri-o://78c2d1368c85c5a888003bc1bca204a67c3948b5750dd423aab901655e8b693b" gracePeriod=30 Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.678559 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="proxy-httpd" containerID="cri-o://893fd81d194bae337b26a4de650ee7402cf6546a76510b16713b2fa91e4add13" gracePeriod=30 Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.678629 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="sg-core" containerID="cri-o://2994d0f1b20da31133454d8d7621c18e7eaf60baecb000eb5f27958614e19bec" gracePeriod=30 Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.678669 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="ceilometer-notification-agent" containerID="cri-o://6458bb4a693bb122671c72474f7f5ba3d390676b6a0b20c111150d5d3a3dac6a" gracePeriod=30 Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.828705 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerID="2994d0f1b20da31133454d8d7621c18e7eaf60baecb000eb5f27958614e19bec" exitCode=2 Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.829106 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80973b39-7225-498d-9bf5-12a6b7d4f3dc","Type":"ContainerDied","Data":"2994d0f1b20da31133454d8d7621c18e7eaf60baecb000eb5f27958614e19bec"} Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.836728 4790 generic.go:334] "Generic (PLEG): container finished" podID="5d1475d1-40ee-4daf-a4ad-f52488205cc0" containerID="b690c39b3b3f4a00e09558c8151b25a3c526c946ee83f9dd9530853ac267bfdb" exitCode=0 Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.836796 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" event={"ID":"5d1475d1-40ee-4daf-a4ad-f52488205cc0","Type":"ContainerDied","Data":"b690c39b3b3f4a00e09558c8151b25a3c526c946ee83f9dd9530853ac267bfdb"} Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.843871 4790 generic.go:334] "Generic (PLEG): container finished" podID="e9504fc0-4063-48bf-87a3-f5e75deba42f" containerID="b3c7fadbe8c3098ca776e6142c6fb85e9f18c3076604286e7dbcc5c1c15cc8c7" exitCode=0 Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.844010 4790 generic.go:334] "Generic (PLEG): container finished" podID="e9504fc0-4063-48bf-87a3-f5e75deba42f" containerID="8e837c52774af4ef3334da1340eb30cba3810dfe1ad01ff69c870840019e5bc1" exitCode=143 Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.843940 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9504fc0-4063-48bf-87a3-f5e75deba42f","Type":"ContainerDied","Data":"b3c7fadbe8c3098ca776e6142c6fb85e9f18c3076604286e7dbcc5c1c15cc8c7"} Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.844152 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e9504fc0-4063-48bf-87a3-f5e75deba42f","Type":"ContainerDied","Data":"8e837c52774af4ef3334da1340eb30cba3810dfe1ad01ff69c870840019e5bc1"} Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.844187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9504fc0-4063-48bf-87a3-f5e75deba42f","Type":"ContainerDied","Data":"e3a3d7fabaf421f080f8b7b749a5e04b086add7d5f809ec91647bae1123ec07b"} Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.844200 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3a3d7fabaf421f080f8b7b749a5e04b086add7d5f809ec91647bae1123ec07b" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.865316 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.881834 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.981229 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5ssq\" (UniqueName: \"kubernetes.io/projected/e9504fc0-4063-48bf-87a3-f5e75deba42f-kube-api-access-n5ssq\") pod \"e9504fc0-4063-48bf-87a3-f5e75deba42f\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.981474 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9504fc0-4063-48bf-87a3-f5e75deba42f-logs\") pod \"e9504fc0-4063-48bf-87a3-f5e75deba42f\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.981511 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-combined-ca-bundle\") pod 
\"e9504fc0-4063-48bf-87a3-f5e75deba42f\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.981544 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-config-data\") pod \"e9504fc0-4063-48bf-87a3-f5e75deba42f\" (UID: \"e9504fc0-4063-48bf-87a3-f5e75deba42f\") " Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.986001 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9504fc0-4063-48bf-87a3-f5e75deba42f-logs" (OuterVolumeSpecName: "logs") pod "e9504fc0-4063-48bf-87a3-f5e75deba42f" (UID: "e9504fc0-4063-48bf-87a3-f5e75deba42f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.991194 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 09:01:06 crc kubenswrapper[4790]: I1003 09:01:06.991717 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9504fc0-4063-48bf-87a3-f5e75deba42f-kube-api-access-n5ssq" (OuterVolumeSpecName: "kube-api-access-n5ssq") pod "e9504fc0-4063-48bf-87a3-f5e75deba42f" (UID: "e9504fc0-4063-48bf-87a3-f5e75deba42f"). InnerVolumeSpecName "kube-api-access-n5ssq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.021218 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-config-data" (OuterVolumeSpecName: "config-data") pod "e9504fc0-4063-48bf-87a3-f5e75deba42f" (UID: "e9504fc0-4063-48bf-87a3-f5e75deba42f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.037885 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9504fc0-4063-48bf-87a3-f5e75deba42f" (UID: "e9504fc0-4063-48bf-87a3-f5e75deba42f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.084209 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-svc\") pod \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.084265 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-nb\") pod \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.084366 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-sb\") pod \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.084396 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-swift-storage-0\") pod \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.084424 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b5l6\" (UniqueName: \"kubernetes.io/projected/5d1475d1-40ee-4daf-a4ad-f52488205cc0-kube-api-access-8b5l6\") pod \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.084506 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-config\") pod \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\" (UID: \"5d1475d1-40ee-4daf-a4ad-f52488205cc0\") " Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.085130 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9504fc0-4063-48bf-87a3-f5e75deba42f-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.085149 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.085161 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9504fc0-4063-48bf-87a3-f5e75deba42f-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.085173 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5ssq\" (UniqueName: \"kubernetes.io/projected/e9504fc0-4063-48bf-87a3-f5e75deba42f-kube-api-access-n5ssq\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.113625 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1475d1-40ee-4daf-a4ad-f52488205cc0-kube-api-access-8b5l6" (OuterVolumeSpecName: "kube-api-access-8b5l6") pod 
"5d1475d1-40ee-4daf-a4ad-f52488205cc0" (UID: "5d1475d1-40ee-4daf-a4ad-f52488205cc0"). InnerVolumeSpecName "kube-api-access-8b5l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.144334 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-config" (OuterVolumeSpecName: "config") pod "5d1475d1-40ee-4daf-a4ad-f52488205cc0" (UID: "5d1475d1-40ee-4daf-a4ad-f52488205cc0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.175136 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d1475d1-40ee-4daf-a4ad-f52488205cc0" (UID: "5d1475d1-40ee-4daf-a4ad-f52488205cc0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.184561 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5d1475d1-40ee-4daf-a4ad-f52488205cc0" (UID: "5d1475d1-40ee-4daf-a4ad-f52488205cc0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.186909 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.186944 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.186957 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b5l6\" (UniqueName: \"kubernetes.io/projected/5d1475d1-40ee-4daf-a4ad-f52488205cc0-kube-api-access-8b5l6\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.186973 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.186950 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5d1475d1-40ee-4daf-a4ad-f52488205cc0" (UID: "5d1475d1-40ee-4daf-a4ad-f52488205cc0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.214087 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5d1475d1-40ee-4daf-a4ad-f52488205cc0" (UID: "5d1475d1-40ee-4daf-a4ad-f52488205cc0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.247462 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.256948 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.288937 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.290616 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5d1475d1-40ee-4daf-a4ad-f52488205cc0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.873810 4790 generic.go:334] "Generic (PLEG): container finished" podID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerID="893fd81d194bae337b26a4de650ee7402cf6546a76510b16713b2fa91e4add13" exitCode=0 Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.874688 4790 generic.go:334] "Generic (PLEG): container finished" podID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerID="78c2d1368c85c5a888003bc1bca204a67c3948b5750dd423aab901655e8b693b" exitCode=0 Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.874789 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80973b39-7225-498d-9bf5-12a6b7d4f3dc","Type":"ContainerDied","Data":"893fd81d194bae337b26a4de650ee7402cf6546a76510b16713b2fa91e4add13"} Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.874822 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"80973b39-7225-498d-9bf5-12a6b7d4f3dc","Type":"ContainerDied","Data":"78c2d1368c85c5a888003bc1bca204a67c3948b5750dd423aab901655e8b693b"} Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.879424 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.879758 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cd99db9f-qwm9f" event={"ID":"5d1475d1-40ee-4daf-a4ad-f52488205cc0","Type":"ContainerDied","Data":"45f9b309c2953395a38c17e5dbdb1b0ccaa4165b1663e892840088cab640e128"} Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.879918 4790 scope.go:117] "RemoveContainer" containerID="b690c39b3b3f4a00e09558c8151b25a3c526c946ee83f9dd9530853ac267bfdb" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.886704 4790 generic.go:334] "Generic (PLEG): container finished" podID="62abaf76-b94a-430c-932f-75164034006a" containerID="c3fb7da91946b3152ddae2cd5e951b930c2d505ffcb9a61ef1bdc5685a8c21da" exitCode=0 Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.886758 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4z5vg" event={"ID":"62abaf76-b94a-430c-932f-75164034006a","Type":"ContainerDied","Data":"c3fb7da91946b3152ddae2cd5e951b930c2d505ffcb9a61ef1bdc5685a8c21da"} Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.889986 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.893385 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e","Type":"ContainerStarted","Data":"77c0dad2984d643d4140f4df6e05494c35156d3e9a8ac64dd49b32f1379a6383"} Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.924665 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cd99db9f-qwm9f"] Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.929322 4790 scope.go:117] "RemoveContainer" containerID="9f1a374df2e3e420a0d83c2145ea53c80d54efbcdf52e221e00cd34741ffcb14" Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.934193 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66cd99db9f-qwm9f"] Oct 03 09:01:07 crc kubenswrapper[4790]: I1003 09:01:07.943352 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.014474 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.058550 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:08 crc kubenswrapper[4790]: E1003 09:01:08.061290 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1475d1-40ee-4daf-a4ad-f52488205cc0" containerName="dnsmasq-dns" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.061322 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1475d1-40ee-4daf-a4ad-f52488205cc0" containerName="dnsmasq-dns" Oct 03 09:01:08 crc kubenswrapper[4790]: E1003 09:01:08.061348 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9504fc0-4063-48bf-87a3-f5e75deba42f" containerName="nova-metadata-log" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.061357 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e9504fc0-4063-48bf-87a3-f5e75deba42f" containerName="nova-metadata-log" Oct 03 09:01:08 crc kubenswrapper[4790]: E1003 09:01:08.061419 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1475d1-40ee-4daf-a4ad-f52488205cc0" containerName="init" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.061431 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1475d1-40ee-4daf-a4ad-f52488205cc0" containerName="init" Oct 03 09:01:08 crc kubenswrapper[4790]: E1003 09:01:08.061458 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9504fc0-4063-48bf-87a3-f5e75deba42f" containerName="nova-metadata-metadata" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.061465 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9504fc0-4063-48bf-87a3-f5e75deba42f" containerName="nova-metadata-metadata" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.061974 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9504fc0-4063-48bf-87a3-f5e75deba42f" containerName="nova-metadata-metadata" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.062013 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1475d1-40ee-4daf-a4ad-f52488205cc0" containerName="dnsmasq-dns" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.062045 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9504fc0-4063-48bf-87a3-f5e75deba42f" containerName="nova-metadata-log" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.067382 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.072175 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.072446 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.085675 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.118335 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-config-data\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.118633 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.118800 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spc2w\" (UniqueName: \"kubernetes.io/projected/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-kube-api-access-spc2w\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.118921 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.119143 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-logs\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.221460 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spc2w\" (UniqueName: \"kubernetes.io/projected/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-kube-api-access-spc2w\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.221556 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.221620 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-logs\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.221652 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-config-data\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " 
pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.221673 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.222644 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-logs\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.234258 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.234716 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.234970 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-config-data\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.244323 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spc2w\" (UniqueName: 
\"kubernetes.io/projected/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-kube-api-access-spc2w\") pod \"nova-metadata-0\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.339898 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1475d1-40ee-4daf-a4ad-f52488205cc0" path="/var/lib/kubelet/pods/5d1475d1-40ee-4daf-a4ad-f52488205cc0/volumes" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.340937 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9504fc0-4063-48bf-87a3-f5e75deba42f" path="/var/lib/kubelet/pods/e9504fc0-4063-48bf-87a3-f5e75deba42f/volumes" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.390204 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.858673 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.901172 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30","Type":"ContainerStarted","Data":"ef8dab063301d159b631cc434c15c25c1cef16c626df36fc71e2d949b85848d2"} Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.903906 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"272a9be6-ce74-4ee6-a1d2-17a8935d8c2e","Type":"ContainerStarted","Data":"911e4f0cc68821effe8bf84a3f61cafc1338c4eedf7fed054eea2bb060b20cf4"} Oct 03 09:01:08 crc kubenswrapper[4790]: I1003 09:01:08.937498 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.538010365 podStartE2EDuration="3.937473222s" podCreationTimestamp="2025-10-03 09:01:05 +0000 UTC" firstStartedPulling="2025-10-03 09:01:07.256681778 +0000 UTC m=+1253.609616016" 
lastFinishedPulling="2025-10-03 09:01:07.656144635 +0000 UTC m=+1254.009078873" observedRunningTime="2025-10-03 09:01:08.919204791 +0000 UTC m=+1255.272139049" watchObservedRunningTime="2025-10-03 09:01:08.937473222 +0000 UTC m=+1255.290407480" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.185117 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4z5vg" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.245934 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-combined-ca-bundle\") pod \"62abaf76-b94a-430c-932f-75164034006a\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.246202 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-scripts\") pod \"62abaf76-b94a-430c-932f-75164034006a\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.246263 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m4bc\" (UniqueName: \"kubernetes.io/projected/62abaf76-b94a-430c-932f-75164034006a-kube-api-access-5m4bc\") pod \"62abaf76-b94a-430c-932f-75164034006a\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.246339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-config-data\") pod \"62abaf76-b94a-430c-932f-75164034006a\" (UID: \"62abaf76-b94a-430c-932f-75164034006a\") " Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.251748 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/62abaf76-b94a-430c-932f-75164034006a-kube-api-access-5m4bc" (OuterVolumeSpecName: "kube-api-access-5m4bc") pod "62abaf76-b94a-430c-932f-75164034006a" (UID: "62abaf76-b94a-430c-932f-75164034006a"). InnerVolumeSpecName "kube-api-access-5m4bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.267805 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-scripts" (OuterVolumeSpecName: "scripts") pod "62abaf76-b94a-430c-932f-75164034006a" (UID: "62abaf76-b94a-430c-932f-75164034006a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.312684 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-config-data" (OuterVolumeSpecName: "config-data") pod "62abaf76-b94a-430c-932f-75164034006a" (UID: "62abaf76-b94a-430c-932f-75164034006a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.335558 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62abaf76-b94a-430c-932f-75164034006a" (UID: "62abaf76-b94a-430c-932f-75164034006a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.349884 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.349944 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.349956 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m4bc\" (UniqueName: \"kubernetes.io/projected/62abaf76-b94a-430c-932f-75164034006a-kube-api-access-5m4bc\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.349966 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62abaf76-b94a-430c-932f-75164034006a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.914560 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-4z5vg" event={"ID":"62abaf76-b94a-430c-932f-75164034006a","Type":"ContainerDied","Data":"97ac07f74773a15bf70301819e8094a8cb29e87220356ff565620b7f296aeeed"} Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.914625 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97ac07f74773a15bf70301819e8094a8cb29e87220356ff565620b7f296aeeed" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.914690 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-4z5vg" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.917607 4790 generic.go:334] "Generic (PLEG): container finished" podID="0eab3760-3551-42a8-9eaa-f53d4b94a8ca" containerID="7db553d5fffb26bff30e236f927e2cb44a7155f07b6763d801d64ead489e1684" exitCode=0 Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.917661 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-xvbcm" event={"ID":"0eab3760-3551-42a8-9eaa-f53d4b94a8ca","Type":"ContainerDied","Data":"7db553d5fffb26bff30e236f927e2cb44a7155f07b6763d801d64ead489e1684"} Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.921080 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30","Type":"ContainerStarted","Data":"f14b6768a33be3ff60a8d67fd4aa51fad64d0d22318d5ed66162db615d88bc98"} Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.921129 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.921143 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30","Type":"ContainerStarted","Data":"afde0c6fb85d57536f16ea26c916182fb31d89bcea7ae3310771bc9f2cb86ab4"} Oct 03 09:01:09 crc kubenswrapper[4790]: I1003 09:01:09.971175 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.97115335 podStartE2EDuration="2.97115335s" podCreationTimestamp="2025-10-03 09:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:09.966686648 +0000 UTC m=+1256.319620886" watchObservedRunningTime="2025-10-03 09:01:09.97115335 +0000 UTC m=+1256.324087588" Oct 03 09:01:10 crc 
kubenswrapper[4790]: I1003 09:01:10.075219 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.075754 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="43f3baeb-3387-4187-97b5-6bced61686f8" containerName="nova-api-log" containerID="cri-o://09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e" gracePeriod=30 Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.075879 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="43f3baeb-3387-4187-97b5-6bced61686f8" containerName="nova-api-api" containerID="cri-o://54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae" gracePeriod=30 Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.086734 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.086936 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="40eba639-5cf7-43e9-a24a-ed0027de7815" containerName="nova-scheduler-scheduler" containerID="cri-o://dee59fba0543ec8d8516375c3df52bfe9f8c7532838a62f064236c0d359c967e" gracePeriod=30 Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.104144 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.724566 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.775643 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f3baeb-3387-4187-97b5-6bced61686f8-logs\") pod \"43f3baeb-3387-4187-97b5-6bced61686f8\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.775831 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-config-data\") pod \"43f3baeb-3387-4187-97b5-6bced61686f8\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.776018 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f3baeb-3387-4187-97b5-6bced61686f8-logs" (OuterVolumeSpecName: "logs") pod "43f3baeb-3387-4187-97b5-6bced61686f8" (UID: "43f3baeb-3387-4187-97b5-6bced61686f8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.776105 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-combined-ca-bundle\") pod \"43f3baeb-3387-4187-97b5-6bced61686f8\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.776157 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlwrc\" (UniqueName: \"kubernetes.io/projected/43f3baeb-3387-4187-97b5-6bced61686f8-kube-api-access-tlwrc\") pod \"43f3baeb-3387-4187-97b5-6bced61686f8\" (UID: \"43f3baeb-3387-4187-97b5-6bced61686f8\") " Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.776862 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43f3baeb-3387-4187-97b5-6bced61686f8-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.785834 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f3baeb-3387-4187-97b5-6bced61686f8-kube-api-access-tlwrc" (OuterVolumeSpecName: "kube-api-access-tlwrc") pod "43f3baeb-3387-4187-97b5-6bced61686f8" (UID: "43f3baeb-3387-4187-97b5-6bced61686f8"). InnerVolumeSpecName "kube-api-access-tlwrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.816444 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43f3baeb-3387-4187-97b5-6bced61686f8" (UID: "43f3baeb-3387-4187-97b5-6bced61686f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.852297 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-config-data" (OuterVolumeSpecName: "config-data") pod "43f3baeb-3387-4187-97b5-6bced61686f8" (UID: "43f3baeb-3387-4187-97b5-6bced61686f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.878715 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.878762 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlwrc\" (UniqueName: \"kubernetes.io/projected/43f3baeb-3387-4187-97b5-6bced61686f8-kube-api-access-tlwrc\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.878776 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43f3baeb-3387-4187-97b5-6bced61686f8-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.929754 4790 generic.go:334] "Generic (PLEG): container finished" podID="43f3baeb-3387-4187-97b5-6bced61686f8" containerID="54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae" exitCode=0 Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.929784 4790 generic.go:334] "Generic (PLEG): container finished" podID="43f3baeb-3387-4187-97b5-6bced61686f8" containerID="09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e" exitCode=143 Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.929812 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.929852 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43f3baeb-3387-4187-97b5-6bced61686f8","Type":"ContainerDied","Data":"54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae"} Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.929876 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43f3baeb-3387-4187-97b5-6bced61686f8","Type":"ContainerDied","Data":"09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e"} Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.929887 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43f3baeb-3387-4187-97b5-6bced61686f8","Type":"ContainerDied","Data":"3526166fa4fab27aef3edccb4cde482b2b8f0c2cbaa1a5296a292fd1ee07b07d"} Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.929901 4790 scope.go:117] "RemoveContainer" containerID="54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.974687 4790 scope.go:117] "RemoveContainer" containerID="09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e" Oct 03 09:01:10 crc kubenswrapper[4790]: I1003 09:01:10.998892 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.016214 4790 scope.go:117] "RemoveContainer" containerID="54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae" Oct 03 09:01:11 crc kubenswrapper[4790]: E1003 09:01:11.017394 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae\": container with ID starting with 54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae not found: ID does not 
exist" containerID="54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.017439 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae"} err="failed to get container status \"54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae\": rpc error: code = NotFound desc = could not find container \"54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae\": container with ID starting with 54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae not found: ID does not exist" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.017467 4790 scope.go:117] "RemoveContainer" containerID="09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e" Oct 03 09:01:11 crc kubenswrapper[4790]: E1003 09:01:11.017827 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e\": container with ID starting with 09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e not found: ID does not exist" containerID="09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.017863 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e"} err="failed to get container status \"09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e\": rpc error: code = NotFound desc = could not find container \"09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e\": container with ID starting with 09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e not found: ID does not exist" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.017884 4790 scope.go:117] 
"RemoveContainer" containerID="54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.018149 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae"} err="failed to get container status \"54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae\": rpc error: code = NotFound desc = could not find container \"54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae\": container with ID starting with 54227f5e2ebb232ea8dd8f6eaa942ce9aa076442a04c518835d2c29183c72eae not found: ID does not exist" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.018173 4790 scope.go:117] "RemoveContainer" containerID="09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.018387 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e"} err="failed to get container status \"09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e\": rpc error: code = NotFound desc = could not find container \"09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e\": container with ID starting with 09ba3b0adb608ee4689e6e35f08981ea03dadc09746b6a6f46d5f4d2df66a45e not found: ID does not exist" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.033524 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.044088 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:11 crc kubenswrapper[4790]: E1003 09:01:11.044556 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f3baeb-3387-4187-97b5-6bced61686f8" containerName="nova-api-log" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 
09:01:11.044590 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f3baeb-3387-4187-97b5-6bced61686f8" containerName="nova-api-log" Oct 03 09:01:11 crc kubenswrapper[4790]: E1003 09:01:11.044630 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62abaf76-b94a-430c-932f-75164034006a" containerName="nova-manage" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.044637 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="62abaf76-b94a-430c-932f-75164034006a" containerName="nova-manage" Oct 03 09:01:11 crc kubenswrapper[4790]: E1003 09:01:11.044663 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f3baeb-3387-4187-97b5-6bced61686f8" containerName="nova-api-api" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.044671 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f3baeb-3387-4187-97b5-6bced61686f8" containerName="nova-api-api" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.044895 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f3baeb-3387-4187-97b5-6bced61686f8" containerName="nova-api-log" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.044916 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="62abaf76-b94a-430c-932f-75164034006a" containerName="nova-manage" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.044928 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f3baeb-3387-4187-97b5-6bced61686f8" containerName="nova-api-api" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.046158 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.049255 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.054612 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.092811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-config-data\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.092888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rllwn\" (UniqueName: \"kubernetes.io/projected/ae318a68-abcf-4d99-ac81-222288742659-kube-api-access-rllwn\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.093042 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae318a68-abcf-4d99-ac81-222288742659-logs\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.093187 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: E1003 09:01:11.157709 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc 
= command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dee59fba0543ec8d8516375c3df52bfe9f8c7532838a62f064236c0d359c967e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:01:11 crc kubenswrapper[4790]: E1003 09:01:11.160745 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dee59fba0543ec8d8516375c3df52bfe9f8c7532838a62f064236c0d359c967e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:01:11 crc kubenswrapper[4790]: E1003 09:01:11.162340 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dee59fba0543ec8d8516375c3df52bfe9f8c7532838a62f064236c0d359c967e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 03 09:01:11 crc kubenswrapper[4790]: E1003 09:01:11.162385 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="40eba639-5cf7-43e9-a24a-ed0027de7815" containerName="nova-scheduler-scheduler" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.195479 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.195687 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-config-data\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.195747 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rllwn\" (UniqueName: \"kubernetes.io/projected/ae318a68-abcf-4d99-ac81-222288742659-kube-api-access-rllwn\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.195774 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae318a68-abcf-4d99-ac81-222288742659-logs\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.196359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae318a68-abcf-4d99-ac81-222288742659-logs\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.203375 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-config-data\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.203424 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.220544 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rllwn\" (UniqueName: \"kubernetes.io/projected/ae318a68-abcf-4d99-ac81-222288742659-kube-api-access-rllwn\") pod \"nova-api-0\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.376687 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.394106 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.501612 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-combined-ca-bundle\") pod \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.501690 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7222\" (UniqueName: \"kubernetes.io/projected/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-kube-api-access-z7222\") pod \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.501742 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-config-data\") pod \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\" (UID: \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.501805 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-fernet-keys\") pod \"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\" (UID: 
\"0eab3760-3551-42a8-9eaa-f53d4b94a8ca\") " Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.507449 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-kube-api-access-z7222" (OuterVolumeSpecName: "kube-api-access-z7222") pod "0eab3760-3551-42a8-9eaa-f53d4b94a8ca" (UID: "0eab3760-3551-42a8-9eaa-f53d4b94a8ca"). InnerVolumeSpecName "kube-api-access-z7222". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.507537 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0eab3760-3551-42a8-9eaa-f53d4b94a8ca" (UID: "0eab3760-3551-42a8-9eaa-f53d4b94a8ca"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.556809 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-config-data" (OuterVolumeSpecName: "config-data") pod "0eab3760-3551-42a8-9eaa-f53d4b94a8ca" (UID: "0eab3760-3551-42a8-9eaa-f53d4b94a8ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.559969 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0eab3760-3551-42a8-9eaa-f53d4b94a8ca" (UID: "0eab3760-3551-42a8-9eaa-f53d4b94a8ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.603927 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.603963 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7222\" (UniqueName: \"kubernetes.io/projected/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-kube-api-access-z7222\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.603975 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.603986 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0eab3760-3551-42a8-9eaa-f53d4b94a8ca-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.865607 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.939948 4790 generic.go:334] "Generic (PLEG): container finished" podID="40eba639-5cf7-43e9-a24a-ed0027de7815" containerID="dee59fba0543ec8d8516375c3df52bfe9f8c7532838a62f064236c0d359c967e" exitCode=0 Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.940016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40eba639-5cf7-43e9-a24a-ed0027de7815","Type":"ContainerDied","Data":"dee59fba0543ec8d8516375c3df52bfe9f8c7532838a62f064236c0d359c967e"} Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.942088 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324701-xvbcm" 
event={"ID":"0eab3760-3551-42a8-9eaa-f53d4b94a8ca","Type":"ContainerDied","Data":"b442af1de11863dfdd5bed757d1323674d03aa01951fdf370d84295b264e597a"} Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.942112 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b442af1de11863dfdd5bed757d1323674d03aa01951fdf370d84295b264e597a" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.942123 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29324701-xvbcm" Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.944336 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" containerName="nova-metadata-log" containerID="cri-o://afde0c6fb85d57536f16ea26c916182fb31d89bcea7ae3310771bc9f2cb86ab4" gracePeriod=30 Oct 03 09:01:11 crc kubenswrapper[4790]: I1003 09:01:11.944915 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" containerName="nova-metadata-metadata" containerID="cri-o://f14b6768a33be3ff60a8d67fd4aa51fad64d0d22318d5ed66162db615d88bc98" gracePeriod=30 Oct 03 09:01:12 crc kubenswrapper[4790]: W1003 09:01:12.119675 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae318a68_abcf_4d99_ac81_222288742659.slice/crio-6fc4ffb28c39d299990eb381bbfbdc0a384c81fd683af0228092326cf0ff0e3b WatchSource:0}: Error finding container 6fc4ffb28c39d299990eb381bbfbdc0a384c81fd683af0228092326cf0ff0e3b: Status 404 returned error can't find the container with id 6fc4ffb28c39d299990eb381bbfbdc0a384c81fd683af0228092326cf0ff0e3b Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.340445 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f3baeb-3387-4187-97b5-6bced61686f8" 
path="/var/lib/kubelet/pods/43f3baeb-3387-4187-97b5-6bced61686f8/volumes" Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.858865 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.933463 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrl5t\" (UniqueName: \"kubernetes.io/projected/40eba639-5cf7-43e9-a24a-ed0027de7815-kube-api-access-mrl5t\") pod \"40eba639-5cf7-43e9-a24a-ed0027de7815\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.933633 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-config-data\") pod \"40eba639-5cf7-43e9-a24a-ed0027de7815\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.933665 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-combined-ca-bundle\") pod \"40eba639-5cf7-43e9-a24a-ed0027de7815\" (UID: \"40eba639-5cf7-43e9-a24a-ed0027de7815\") " Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.938820 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40eba639-5cf7-43e9-a24a-ed0027de7815-kube-api-access-mrl5t" (OuterVolumeSpecName: "kube-api-access-mrl5t") pod "40eba639-5cf7-43e9-a24a-ed0027de7815" (UID: "40eba639-5cf7-43e9-a24a-ed0027de7815"). InnerVolumeSpecName "kube-api-access-mrl5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.971650 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40eba639-5cf7-43e9-a24a-ed0027de7815" (UID: "40eba639-5cf7-43e9-a24a-ed0027de7815"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.976910 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-config-data" (OuterVolumeSpecName: "config-data") pod "40eba639-5cf7-43e9-a24a-ed0027de7815" (UID: "40eba639-5cf7-43e9-a24a-ed0027de7815"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.987157 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.987157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40eba639-5cf7-43e9-a24a-ed0027de7815","Type":"ContainerDied","Data":"5ef62d6f196903bc469127cce486ea1dc905ed230e76b5a12d5ad5b5d795b96b"} Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.987293 4790 scope.go:117] "RemoveContainer" containerID="dee59fba0543ec8d8516375c3df52bfe9f8c7532838a62f064236c0d359c967e" Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.991861 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae318a68-abcf-4d99-ac81-222288742659","Type":"ContainerStarted","Data":"507f5bfd2cc42ead6ab943ac90369c48007561516a67a237767096f54ee24f69"} Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.991905 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae318a68-abcf-4d99-ac81-222288742659","Type":"ContainerStarted","Data":"9ebf0a050ae49b26574fcd2a3a3d7554664709d9a087cf5a8bc8807d902e4fff"} Oct 03 09:01:12 crc kubenswrapper[4790]: I1003 09:01:12.991920 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae318a68-abcf-4d99-ac81-222288742659","Type":"ContainerStarted","Data":"6fc4ffb28c39d299990eb381bbfbdc0a384c81fd683af0228092326cf0ff0e3b"} Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.001803 4790 generic.go:334] "Generic (PLEG): container finished" podID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerID="6458bb4a693bb122671c72474f7f5ba3d390676b6a0b20c111150d5d3a3dac6a" exitCode=0 Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.001869 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80973b39-7225-498d-9bf5-12a6b7d4f3dc","Type":"ContainerDied","Data":"6458bb4a693bb122671c72474f7f5ba3d390676b6a0b20c111150d5d3a3dac6a"} Oct 03 09:01:13 crc 
kubenswrapper[4790]: I1003 09:01:13.005655 4790 generic.go:334] "Generic (PLEG): container finished" podID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" containerID="f14b6768a33be3ff60a8d67fd4aa51fad64d0d22318d5ed66162db615d88bc98" exitCode=0 Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.005694 4790 generic.go:334] "Generic (PLEG): container finished" podID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" containerID="afde0c6fb85d57536f16ea26c916182fb31d89bcea7ae3310771bc9f2cb86ab4" exitCode=143 Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.005716 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30","Type":"ContainerDied","Data":"f14b6768a33be3ff60a8d67fd4aa51fad64d0d22318d5ed66162db615d88bc98"} Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.005740 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30","Type":"ContainerDied","Data":"afde0c6fb85d57536f16ea26c916182fb31d89bcea7ae3310771bc9f2cb86ab4"} Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.022798 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.022776462 podStartE2EDuration="3.022776462s" podCreationTimestamp="2025-10-03 09:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:13.00847841 +0000 UTC m=+1259.361412648" watchObservedRunningTime="2025-10-03 09:01:13.022776462 +0000 UTC m=+1259.375710700" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.037263 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrl5t\" (UniqueName: \"kubernetes.io/projected/40eba639-5cf7-43e9-a24a-ed0027de7815-kube-api-access-mrl5t\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.037312 4790 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.037323 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40eba639-5cf7-43e9-a24a-ed0027de7815-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.043271 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.087723 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.099212 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:13 crc kubenswrapper[4790]: E1003 09:01:13.099711 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eab3760-3551-42a8-9eaa-f53d4b94a8ca" containerName="keystone-cron" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.099734 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eab3760-3551-42a8-9eaa-f53d4b94a8ca" containerName="keystone-cron" Oct 03 09:01:13 crc kubenswrapper[4790]: E1003 09:01:13.099767 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40eba639-5cf7-43e9-a24a-ed0027de7815" containerName="nova-scheduler-scheduler" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.099777 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="40eba639-5cf7-43e9-a24a-ed0027de7815" containerName="nova-scheduler-scheduler" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.100011 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="40eba639-5cf7-43e9-a24a-ed0027de7815" containerName="nova-scheduler-scheduler" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.100048 
4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eab3760-3551-42a8-9eaa-f53d4b94a8ca" containerName="keystone-cron" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.101116 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.105144 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.112642 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.114404 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.138041 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-nova-metadata-tls-certs\") pod \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.138223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spc2w\" (UniqueName: \"kubernetes.io/projected/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-kube-api-access-spc2w\") pod \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.138244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-combined-ca-bundle\") pod \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.138313 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-logs\") pod \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.138410 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-config-data\") pod \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\" (UID: \"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.138650 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-config-data\") pod \"nova-scheduler-0\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.138697 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtvdh\" (UniqueName: \"kubernetes.io/projected/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-kube-api-access-wtvdh\") pod \"nova-scheduler-0\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.138809 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.141795 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-logs" (OuterVolumeSpecName: "logs") pod 
"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" (UID: "7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.146924 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-kube-api-access-spc2w" (OuterVolumeSpecName: "kube-api-access-spc2w") pod "7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" (UID: "7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30"). InnerVolumeSpecName "kube-api-access-spc2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.175816 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-config-data" (OuterVolumeSpecName: "config-data") pod "7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" (UID: "7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.226745 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" (UID: "7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.232862 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" (UID: "7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.241130 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.241494 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-config-data\") pod \"nova-scheduler-0\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.241713 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtvdh\" (UniqueName: \"kubernetes.io/projected/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-kube-api-access-wtvdh\") pod \"nova-scheduler-0\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.242283 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.242418 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.242517 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spc2w\" (UniqueName: \"kubernetes.io/projected/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-kube-api-access-spc2w\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 
crc kubenswrapper[4790]: I1003 09:01:13.242624 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.242709 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.245131 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-config-data\") pod \"nova-scheduler-0\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.245308 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.270596 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtvdh\" (UniqueName: \"kubernetes.io/projected/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-kube-api-access-wtvdh\") pod \"nova-scheduler-0\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.295614 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.343804 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-config-data\") pod \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.343869 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28gh2\" (UniqueName: \"kubernetes.io/projected/80973b39-7225-498d-9bf5-12a6b7d4f3dc-kube-api-access-28gh2\") pod \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.343938 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-sg-core-conf-yaml\") pod \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.343997 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-run-httpd\") pod \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.344056 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-log-httpd\") pod \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.344108 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-scripts\") pod \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.344230 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-combined-ca-bundle\") pod \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\" (UID: \"80973b39-7225-498d-9bf5-12a6b7d4f3dc\") " Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.345097 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "80973b39-7225-498d-9bf5-12a6b7d4f3dc" (UID: "80973b39-7225-498d-9bf5-12a6b7d4f3dc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.348704 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-scripts" (OuterVolumeSpecName: "scripts") pod "80973b39-7225-498d-9bf5-12a6b7d4f3dc" (UID: "80973b39-7225-498d-9bf5-12a6b7d4f3dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.350861 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "80973b39-7225-498d-9bf5-12a6b7d4f3dc" (UID: "80973b39-7225-498d-9bf5-12a6b7d4f3dc"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.359646 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80973b39-7225-498d-9bf5-12a6b7d4f3dc-kube-api-access-28gh2" (OuterVolumeSpecName: "kube-api-access-28gh2") pod "80973b39-7225-498d-9bf5-12a6b7d4f3dc" (UID: "80973b39-7225-498d-9bf5-12a6b7d4f3dc"). InnerVolumeSpecName "kube-api-access-28gh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.392665 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "80973b39-7225-498d-9bf5-12a6b7d4f3dc" (UID: "80973b39-7225-498d-9bf5-12a6b7d4f3dc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.435654 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80973b39-7225-498d-9bf5-12a6b7d4f3dc" (UID: "80973b39-7225-498d-9bf5-12a6b7d4f3dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.442243 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.448027 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.448068 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80973b39-7225-498d-9bf5-12a6b7d4f3dc-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.448083 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.448095 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.448111 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28gh2\" (UniqueName: \"kubernetes.io/projected/80973b39-7225-498d-9bf5-12a6b7d4f3dc-kube-api-access-28gh2\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.448122 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.448710 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-config-data" (OuterVolumeSpecName: "config-data") pod "80973b39-7225-498d-9bf5-12a6b7d4f3dc" (UID: "80973b39-7225-498d-9bf5-12a6b7d4f3dc"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.551261 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80973b39-7225-498d-9bf5-12a6b7d4f3dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:13 crc kubenswrapper[4790]: I1003 09:01:13.898111 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.017138 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2","Type":"ContainerStarted","Data":"c7dd849ea70c2f5fe02cee83a028d389a040e2b4070b68e3a4aa78092f555d57"} Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.025067 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.025247 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"80973b39-7225-498d-9bf5-12a6b7d4f3dc","Type":"ContainerDied","Data":"1f83e9d32b08f17ab0f8fee683988d4d7f9aeddfb5450f3610e57fde4c832a3a"} Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.025446 4790 scope.go:117] "RemoveContainer" containerID="893fd81d194bae337b26a4de650ee7402cf6546a76510b16713b2fa91e4add13" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.029303 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.030381 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30","Type":"ContainerDied","Data":"ef8dab063301d159b631cc434c15c25c1cef16c626df36fc71e2d949b85848d2"} Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.048670 4790 scope.go:117] "RemoveContainer" containerID="2994d0f1b20da31133454d8d7621c18e7eaf60baecb000eb5f27958614e19bec" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.077816 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.096637 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.101867 4790 scope.go:117] "RemoveContainer" containerID="6458bb4a693bb122671c72474f7f5ba3d390676b6a0b20c111150d5d3a3dac6a" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.123881 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.143691 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.152631 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:14 crc kubenswrapper[4790]: E1003 09:01:14.153358 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" containerName="nova-metadata-metadata" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.153491 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" containerName="nova-metadata-metadata" Oct 03 09:01:14 crc kubenswrapper[4790]: E1003 09:01:14.153601 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" containerName="nova-metadata-log" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.153723 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" containerName="nova-metadata-log" Oct 03 09:01:14 crc kubenswrapper[4790]: E1003 09:01:14.153845 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="ceilometer-notification-agent" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.153924 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="ceilometer-notification-agent" Oct 03 09:01:14 crc kubenswrapper[4790]: E1003 09:01:14.154012 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="sg-core" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.154099 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="sg-core" Oct 03 09:01:14 crc kubenswrapper[4790]: E1003 09:01:14.154185 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="ceilometer-central-agent" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.154259 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="ceilometer-central-agent" Oct 03 09:01:14 crc kubenswrapper[4790]: E1003 09:01:14.154354 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="proxy-httpd" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.154429 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="proxy-httpd" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.154791 4790 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="ceilometer-central-agent" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.154901 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="proxy-httpd" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.154984 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" containerName="nova-metadata-log" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.155074 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="ceilometer-notification-agent" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.155166 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" containerName="sg-core" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.155247 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" containerName="nova-metadata-metadata" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.157731 4790 scope.go:117] "RemoveContainer" containerID="78c2d1368c85c5a888003bc1bca204a67c3948b5750dd423aab901655e8b693b" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.159528 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.162367 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.162623 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.162992 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.166076 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.173323 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.186441 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.186566 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.192733 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.193300 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.201088 4790 scope.go:117] "RemoveContainer" containerID="f14b6768a33be3ff60a8d67fd4aa51fad64d0d22318d5ed66162db615d88bc98" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.226280 4790 scope.go:117] "RemoveContainer" containerID="afde0c6fb85d57536f16ea26c916182fb31d89bcea7ae3310771bc9f2cb86ab4" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.266982 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.267071 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.267335 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg6s2\" (UniqueName: \"kubernetes.io/projected/da09420c-8163-4ee2-a87c-775dc33c4c88-kube-api-access-hg6s2\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.267419 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-run-httpd\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.267496 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-log-httpd\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.267658 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.268653 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-scripts\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.268901 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-config-data\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.348189 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40eba639-5cf7-43e9-a24a-ed0027de7815" 
path="/var/lib/kubelet/pods/40eba639-5cf7-43e9-a24a-ed0027de7815/volumes" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.349143 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30" path="/var/lib/kubelet/pods/7ddcf8dc-3b68-43c0-a78c-88aeb1fe7f30/volumes" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.350371 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80973b39-7225-498d-9bf5-12a6b7d4f3dc" path="/var/lib/kubelet/pods/80973b39-7225-498d-9bf5-12a6b7d4f3dc/volumes" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.370747 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-run-httpd\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.370804 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-log-httpd\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.370837 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.370864 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " 
pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.370890 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-scripts\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.370923 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-config-data\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.370940 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-config-data\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.370998 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.371034 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wcfn\" (UniqueName: \"kubernetes.io/projected/c4e8431e-8789-4bda-a146-4b2c056eb240-kube-api-access-8wcfn\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.371052 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.371071 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.371086 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4e8431e-8789-4bda-a146-4b2c056eb240-logs\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.371125 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg6s2\" (UniqueName: \"kubernetes.io/projected/da09420c-8163-4ee2-a87c-775dc33c4c88-kube-api-access-hg6s2\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.371711 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-run-httpd\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.371946 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-log-httpd\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") 
" pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.376055 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.376122 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.376160 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.376522 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.385410 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.385633 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-config-data\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.388445 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 
09:01:14.391040 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg6s2\" (UniqueName: \"kubernetes.io/projected/da09420c-8163-4ee2-a87c-775dc33c4c88-kube-api-access-hg6s2\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.391782 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-scripts\") pod \"ceilometer-0\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.473329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.473406 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-config-data\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.473558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wcfn\" (UniqueName: \"kubernetes.io/projected/c4e8431e-8789-4bda-a146-4b2c056eb240-kube-api-access-8wcfn\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.473592 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.473611 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4e8431e-8789-4bda-a146-4b2c056eb240-logs\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.476264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4e8431e-8789-4bda-a146-4b2c056eb240-logs\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.477351 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.478948 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.479234 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.489867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 
09:01:14.491121 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-config-data\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.493868 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wcfn\" (UniqueName: \"kubernetes.io/projected/c4e8431e-8789-4bda-a146-4b2c056eb240-kube-api-access-8wcfn\") pod \"nova-metadata-0\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") " pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.496860 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.510743 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:14 crc kubenswrapper[4790]: I1003 09:01:14.990656 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:15 crc kubenswrapper[4790]: I1003 09:01:15.058402 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2","Type":"ContainerStarted","Data":"ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e"} Oct 03 09:01:15 crc kubenswrapper[4790]: I1003 09:01:15.060404 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da09420c-8163-4ee2-a87c-775dc33c4c88","Type":"ContainerStarted","Data":"a89ad72aace2d295e7e2c1a2666fb21d262bc9cf0e59692836851e434dc2e199"} Oct 03 09:01:15 crc kubenswrapper[4790]: I1003 09:01:15.095623 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.095603949 podStartE2EDuration="2.095603949s" 
podCreationTimestamp="2025-10-03 09:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:15.080773233 +0000 UTC m=+1261.433707471" watchObservedRunningTime="2025-10-03 09:01:15.095603949 +0000 UTC m=+1261.448538187" Oct 03 09:01:15 crc kubenswrapper[4790]: I1003 09:01:15.109193 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:15 crc kubenswrapper[4790]: E1003 09:01:15.673272 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1f1bdf_e9e5_4a90_b873_ae755f89c4cc.slice/crio-conmon-6cdc7e4e2579294bae2eea362c1ec158e112005ba8b149ba23fea5acc8e1c523.scope\": RecentStats: unable to find data in memory cache]" Oct 03 09:01:16 crc kubenswrapper[4790]: I1003 09:01:16.084117 4790 generic.go:334] "Generic (PLEG): container finished" podID="1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc" containerID="6cdc7e4e2579294bae2eea362c1ec158e112005ba8b149ba23fea5acc8e1c523" exitCode=0 Oct 03 09:01:16 crc kubenswrapper[4790]: I1003 09:01:16.084494 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vhvvz" event={"ID":"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc","Type":"ContainerDied","Data":"6cdc7e4e2579294bae2eea362c1ec158e112005ba8b149ba23fea5acc8e1c523"} Oct 03 09:01:16 crc kubenswrapper[4790]: I1003 09:01:16.091031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4e8431e-8789-4bda-a146-4b2c056eb240","Type":"ContainerStarted","Data":"cb8f2fd06148764e43f1d0e5967da1b50cf9da473be0289f0d22f4759f30f2b5"} Oct 03 09:01:16 crc kubenswrapper[4790]: I1003 09:01:16.091083 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"c4e8431e-8789-4bda-a146-4b2c056eb240","Type":"ContainerStarted","Data":"b4e7ca2907c6c3d5b79917495ec50be29601d0b1109983fec0b2484e44f7e029"} Oct 03 09:01:16 crc kubenswrapper[4790]: I1003 09:01:16.091094 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4e8431e-8789-4bda-a146-4b2c056eb240","Type":"ContainerStarted","Data":"6cbfd1712bfe4132a42482d57f67e458dcb0b03feedd82c68ac1efced4af49e8"} Oct 03 09:01:16 crc kubenswrapper[4790]: I1003 09:01:16.095127 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da09420c-8163-4ee2-a87c-775dc33c4c88","Type":"ContainerStarted","Data":"35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b"} Oct 03 09:01:16 crc kubenswrapper[4790]: I1003 09:01:16.095170 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da09420c-8163-4ee2-a87c-775dc33c4c88","Type":"ContainerStarted","Data":"9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68"} Oct 03 09:01:16 crc kubenswrapper[4790]: I1003 09:01:16.137849 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.137830252 podStartE2EDuration="2.137830252s" podCreationTimestamp="2025-10-03 09:01:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:16.129308659 +0000 UTC m=+1262.482242917" watchObservedRunningTime="2025-10-03 09:01:16.137830252 +0000 UTC m=+1262.490764500" Oct 03 09:01:16 crc kubenswrapper[4790]: I1003 09:01:16.386404 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.108977 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"da09420c-8163-4ee2-a87c-775dc33c4c88","Type":"ContainerStarted","Data":"50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871"} Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.495280 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vhvvz" Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.668067 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-combined-ca-bundle\") pod \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.668170 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-config-data\") pod \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.668252 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-scripts\") pod \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.668333 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kqfc\" (UniqueName: \"kubernetes.io/projected/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-kube-api-access-9kqfc\") pod \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\" (UID: \"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc\") " Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.677016 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-kube-api-access-9kqfc" 
(OuterVolumeSpecName: "kube-api-access-9kqfc") pod "1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc" (UID: "1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc"). InnerVolumeSpecName "kube-api-access-9kqfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.678786 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-scripts" (OuterVolumeSpecName: "scripts") pod "1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc" (UID: "1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.706551 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-config-data" (OuterVolumeSpecName: "config-data") pod "1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc" (UID: "1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.711709 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc" (UID: "1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.770875 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.770904 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.770913 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:17 crc kubenswrapper[4790]: I1003 09:01:17.770921 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kqfc\" (UniqueName: \"kubernetes.io/projected/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc-kube-api-access-9kqfc\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.131274 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-vhvvz" event={"ID":"1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc","Type":"ContainerDied","Data":"1fa7c036124f1ba2e40d222e1d6d73c93f73cec2a2f174b51d3fc595305f8994"} Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.131661 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fa7c036124f1ba2e40d222e1d6d73c93f73cec2a2f174b51d3fc595305f8994" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.131354 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-vhvvz" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.212900 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 09:01:18 crc kubenswrapper[4790]: E1003 09:01:18.213353 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc" containerName="nova-cell1-conductor-db-sync" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.213371 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc" containerName="nova-cell1-conductor-db-sync" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.213654 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc" containerName="nova-cell1-conductor-db-sync" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.215987 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.218733 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.226838 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.383472 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbb59\" (UniqueName: \"kubernetes.io/projected/a529d30e-fdca-4eb5-8233-34e63534a164-kube-api-access-dbb59\") pod \"nova-cell1-conductor-0\" (UID: \"a529d30e-fdca-4eb5-8233-34e63534a164\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.383563 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a529d30e-fdca-4eb5-8233-34e63534a164-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a529d30e-fdca-4eb5-8233-34e63534a164\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.383701 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a529d30e-fdca-4eb5-8233-34e63534a164-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a529d30e-fdca-4eb5-8233-34e63534a164\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.443765 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.485401 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a529d30e-fdca-4eb5-8233-34e63534a164-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a529d30e-fdca-4eb5-8233-34e63534a164\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.485648 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbb59\" (UniqueName: \"kubernetes.io/projected/a529d30e-fdca-4eb5-8233-34e63534a164-kube-api-access-dbb59\") pod \"nova-cell1-conductor-0\" (UID: \"a529d30e-fdca-4eb5-8233-34e63534a164\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.485699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a529d30e-fdca-4eb5-8233-34e63534a164-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a529d30e-fdca-4eb5-8233-34e63534a164\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.490376 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a529d30e-fdca-4eb5-8233-34e63534a164-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a529d30e-fdca-4eb5-8233-34e63534a164\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.490463 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a529d30e-fdca-4eb5-8233-34e63534a164-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a529d30e-fdca-4eb5-8233-34e63534a164\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.502707 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbb59\" (UniqueName: \"kubernetes.io/projected/a529d30e-fdca-4eb5-8233-34e63534a164-kube-api-access-dbb59\") pod \"nova-cell1-conductor-0\" (UID: \"a529d30e-fdca-4eb5-8233-34e63534a164\") " pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:18 crc kubenswrapper[4790]: I1003 09:01:18.535425 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:19 crc kubenswrapper[4790]: I1003 09:01:19.003551 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 03 09:01:19 crc kubenswrapper[4790]: W1003 09:01:19.004600 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda529d30e_fdca_4eb5_8233_34e63534a164.slice/crio-4d274d6d9b6c7b7a2e007f5c51e277297804563060986ca179c19f2666d06e13 WatchSource:0}: Error finding container 4d274d6d9b6c7b7a2e007f5c51e277297804563060986ca179c19f2666d06e13: Status 404 returned error can't find the container with id 4d274d6d9b6c7b7a2e007f5c51e277297804563060986ca179c19f2666d06e13 Oct 03 09:01:19 crc kubenswrapper[4790]: I1003 09:01:19.142386 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a529d30e-fdca-4eb5-8233-34e63534a164","Type":"ContainerStarted","Data":"4d274d6d9b6c7b7a2e007f5c51e277297804563060986ca179c19f2666d06e13"} Oct 03 09:01:19 crc kubenswrapper[4790]: I1003 09:01:19.145484 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da09420c-8163-4ee2-a87c-775dc33c4c88","Type":"ContainerStarted","Data":"55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21"} Oct 03 09:01:19 crc kubenswrapper[4790]: I1003 09:01:19.146970 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 09:01:19 crc kubenswrapper[4790]: I1003 09:01:19.511282 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:01:19 crc kubenswrapper[4790]: I1003 09:01:19.511420 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:01:20 crc kubenswrapper[4790]: I1003 09:01:20.157738 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-0" event={"ID":"a529d30e-fdca-4eb5-8233-34e63534a164","Type":"ContainerStarted","Data":"6d61dbe4edd3c86f8837cf401c84ec13fe52a4ec75fc54e68317aaf4add8adcc"} Oct 03 09:01:20 crc kubenswrapper[4790]: I1003 09:01:20.172229 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.1722098069999998 podStartE2EDuration="2.172209807s" podCreationTimestamp="2025-10-03 09:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:20.171664153 +0000 UTC m=+1266.524598401" watchObservedRunningTime="2025-10-03 09:01:20.172209807 +0000 UTC m=+1266.525144045" Oct 03 09:01:20 crc kubenswrapper[4790]: I1003 09:01:20.174016 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.079991204 podStartE2EDuration="6.174007387s" podCreationTimestamp="2025-10-03 09:01:14 +0000 UTC" firstStartedPulling="2025-10-03 09:01:15.001724197 +0000 UTC m=+1261.354658435" lastFinishedPulling="2025-10-03 09:01:18.09574034 +0000 UTC m=+1264.448674618" observedRunningTime="2025-10-03 09:01:19.179093171 +0000 UTC m=+1265.532027419" watchObservedRunningTime="2025-10-03 09:01:20.174007387 +0000 UTC m=+1266.526941625" Oct 03 09:01:21 crc kubenswrapper[4790]: I1003 09:01:21.167396 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:21 crc kubenswrapper[4790]: I1003 09:01:21.377862 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:01:21 crc kubenswrapper[4790]: I1003 09:01:21.377936 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:01:22 crc kubenswrapper[4790]: I1003 09:01:22.459898 4790 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="ae318a68-abcf-4d99-ac81-222288742659" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:01:22 crc kubenswrapper[4790]: I1003 09:01:22.459920 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ae318a68-abcf-4d99-ac81-222288742659" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:01:23 crc kubenswrapper[4790]: I1003 09:01:23.381996 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-7657d6d659-5lb95" podUID="9ee67672-dab9-4d89-b6b4-02eb06e8fe8d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 03 09:01:23 crc kubenswrapper[4790]: I1003 09:01:23.443929 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 09:01:23 crc kubenswrapper[4790]: I1003 09:01:23.487504 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 09:01:24 crc kubenswrapper[4790]: I1003 09:01:24.252324 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 09:01:24 crc kubenswrapper[4790]: I1003 09:01:24.511560 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 09:01:24 crc kubenswrapper[4790]: I1003 09:01:24.511686 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 09:01:25 crc kubenswrapper[4790]: I1003 09:01:25.523753 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 09:01:25 crc kubenswrapper[4790]: I1003 09:01:25.523753 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 09:01:28 crc kubenswrapper[4790]: I1003 09:01:28.564415 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 03 09:01:31 crc kubenswrapper[4790]: I1003 09:01:31.383733 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 09:01:31 crc kubenswrapper[4790]: I1003 09:01:31.384600 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 09:01:31 crc kubenswrapper[4790]: I1003 09:01:31.384886 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 09:01:31 crc kubenswrapper[4790]: I1003 09:01:31.389265 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.273112 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.283225 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.477875 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59fd7cc775-bxj2s"] Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.480191 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.490785 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59fd7cc775-bxj2s"] Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.575713 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-config\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.575988 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5lmm\" (UniqueName: \"kubernetes.io/projected/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-kube-api-access-q5lmm\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.576021 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-nb\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.576064 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-sb\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.576088 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-svc\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.577547 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-swift-storage-0\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.679831 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-config\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.679961 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5lmm\" (UniqueName: \"kubernetes.io/projected/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-kube-api-access-q5lmm\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.679993 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-nb\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.680040 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-sb\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.680065 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-svc\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.680086 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-swift-storage-0\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.681271 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-swift-storage-0\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.681303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-sb\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.681369 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-nb\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.681523 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-svc\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.682833 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-config\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.703172 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5lmm\" (UniqueName: \"kubernetes.io/projected/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-kube-api-access-q5lmm\") pod \"dnsmasq-dns-59fd7cc775-bxj2s\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:32 crc kubenswrapper[4790]: I1003 09:01:32.835138 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:33 crc kubenswrapper[4790]: I1003 09:01:33.317784 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59fd7cc775-bxj2s"] Oct 03 09:01:33 crc kubenswrapper[4790]: W1003 09:01:33.333869 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b00ed4e_aa3d_4cb2_beaf_65ea32d33f4f.slice/crio-4ba559de7664fd811939d1cb53e0f18d25d52611fc9b1284c0cb38b12647247f WatchSource:0}: Error finding container 4ba559de7664fd811939d1cb53e0f18d25d52611fc9b1284c0cb38b12647247f: Status 404 returned error can't find the container with id 4ba559de7664fd811939d1cb53e0f18d25d52611fc9b1284c0cb38b12647247f Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.297188 4790 generic.go:334] "Generic (PLEG): container finished" podID="5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" containerID="e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc" exitCode=0 Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.298317 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" event={"ID":"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f","Type":"ContainerDied","Data":"e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc"} Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.298343 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" event={"ID":"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f","Type":"ContainerStarted","Data":"4ba559de7664fd811939d1cb53e0f18d25d52611fc9b1284c0cb38b12647247f"} Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.520718 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.524371 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 
03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.525262 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.774971 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.775314 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="ceilometer-central-agent" containerID="cri-o://9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68" gracePeriod=30 Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.775456 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="proxy-httpd" containerID="cri-o://55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21" gracePeriod=30 Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.775508 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="sg-core" containerID="cri-o://50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871" gracePeriod=30 Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.775550 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="ceilometer-notification-agent" containerID="cri-o://35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b" gracePeriod=30 Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.792477 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="proxy-httpd" probeResult="failure" output="Get 
\"https://10.217.0.215:3000/\": EOF" Oct 03 09:01:34 crc kubenswrapper[4790]: I1003 09:01:34.919331 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.312183 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" event={"ID":"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f","Type":"ContainerStarted","Data":"4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede"} Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.312268 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.318098 4790 generic.go:334] "Generic (PLEG): container finished" podID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerID="55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21" exitCode=0 Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.318144 4790 generic.go:334] "Generic (PLEG): container finished" podID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerID="50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871" exitCode=2 Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.318154 4790 generic.go:334] "Generic (PLEG): container finished" podID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerID="9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68" exitCode=0 Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.318176 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da09420c-8163-4ee2-a87c-775dc33c4c88","Type":"ContainerDied","Data":"55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21"} Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.318220 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"da09420c-8163-4ee2-a87c-775dc33c4c88","Type":"ContainerDied","Data":"50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871"} Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.318232 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da09420c-8163-4ee2-a87c-775dc33c4c88","Type":"ContainerDied","Data":"9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68"} Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.318319 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ae318a68-abcf-4d99-ac81-222288742659" containerName="nova-api-log" containerID="cri-o://9ebf0a050ae49b26574fcd2a3a3d7554664709d9a087cf5a8bc8807d902e4fff" gracePeriod=30 Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.318423 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ae318a68-abcf-4d99-ac81-222288742659" containerName="nova-api-api" containerID="cri-o://507f5bfd2cc42ead6ab943ac90369c48007561516a67a237767096f54ee24f69" gracePeriod=30 Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.324524 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 09:01:35 crc kubenswrapper[4790]: I1003 09:01:35.354025 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" podStartSLOduration=3.354006416 podStartE2EDuration="3.354006416s" podCreationTimestamp="2025-10-03 09:01:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:35.343318683 +0000 UTC m=+1281.696252921" watchObservedRunningTime="2025-10-03 09:01:35.354006416 +0000 UTC m=+1281.706940654" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.293850 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.360893 4790 generic.go:334] "Generic (PLEG): container finished" podID="17497332-4b10-4784-9052-b310611d9c0a" containerID="c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38" exitCode=137 Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.360986 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.366109 4790 generic.go:334] "Generic (PLEG): container finished" podID="ae318a68-abcf-4d99-ac81-222288742659" containerID="9ebf0a050ae49b26574fcd2a3a3d7554664709d9a087cf5a8bc8807d902e4fff" exitCode=143 Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.443375 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"17497332-4b10-4784-9052-b310611d9c0a","Type":"ContainerDied","Data":"c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38"} Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.443460 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"17497332-4b10-4784-9052-b310611d9c0a","Type":"ContainerDied","Data":"c8470b883c2a6f444f8bd680b2ed161ba12789d3815860a07e6daf6ccc290d88"} Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.443481 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae318a68-abcf-4d99-ac81-222288742659","Type":"ContainerDied","Data":"9ebf0a050ae49b26574fcd2a3a3d7554664709d9a087cf5a8bc8807d902e4fff"} Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.443571 4790 scope.go:117] "RemoveContainer" containerID="c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.455670 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-config-data\") pod \"17497332-4b10-4784-9052-b310611d9c0a\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.455867 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqv4j\" (UniqueName: \"kubernetes.io/projected/17497332-4b10-4784-9052-b310611d9c0a-kube-api-access-cqv4j\") pod \"17497332-4b10-4784-9052-b310611d9c0a\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.455906 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-combined-ca-bundle\") pod \"17497332-4b10-4784-9052-b310611d9c0a\" (UID: \"17497332-4b10-4784-9052-b310611d9c0a\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.471355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17497332-4b10-4784-9052-b310611d9c0a-kube-api-access-cqv4j" (OuterVolumeSpecName: "kube-api-access-cqv4j") pod "17497332-4b10-4784-9052-b310611d9c0a" (UID: "17497332-4b10-4784-9052-b310611d9c0a"). InnerVolumeSpecName "kube-api-access-cqv4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.488211 4790 scope.go:117] "RemoveContainer" containerID="c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38" Oct 03 09:01:36 crc kubenswrapper[4790]: E1003 09:01:36.488697 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38\": container with ID starting with c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38 not found: ID does not exist" containerID="c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.488751 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38"} err="failed to get container status \"c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38\": rpc error: code = NotFound desc = could not find container \"c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38\": container with ID starting with c27b832dd86ad054387e110100163290aa0d4e73019f64f872f90c9276de4f38 not found: ID does not exist" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.495255 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-config-data" (OuterVolumeSpecName: "config-data") pod "17497332-4b10-4784-9052-b310611d9c0a" (UID: "17497332-4b10-4784-9052-b310611d9c0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.497249 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17497332-4b10-4784-9052-b310611d9c0a" (UID: "17497332-4b10-4784-9052-b310611d9c0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.564011 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqv4j\" (UniqueName: \"kubernetes.io/projected/17497332-4b10-4784-9052-b310611d9c0a-kube-api-access-cqv4j\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.564052 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.564065 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17497332-4b10-4784-9052-b310611d9c0a-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.696316 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.705667 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.725123 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:01:36 crc kubenswrapper[4790]: E1003 09:01:36.725698 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17497332-4b10-4784-9052-b310611d9c0a" 
containerName="nova-cell1-novncproxy-novncproxy" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.725732 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="17497332-4b10-4784-9052-b310611d9c0a" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.726014 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="17497332-4b10-4784-9052-b310611d9c0a" containerName="nova-cell1-novncproxy-novncproxy" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.726948 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.729519 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.729878 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.729991 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.734841 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.750918 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767106 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-run-httpd\") pod \"da09420c-8163-4ee2-a87c-775dc33c4c88\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767160 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-combined-ca-bundle\") pod \"da09420c-8163-4ee2-a87c-775dc33c4c88\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767192 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-scripts\") pod \"da09420c-8163-4ee2-a87c-775dc33c4c88\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767209 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-log-httpd\") pod \"da09420c-8163-4ee2-a87c-775dc33c4c88\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767247 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg6s2\" (UniqueName: \"kubernetes.io/projected/da09420c-8163-4ee2-a87c-775dc33c4c88-kube-api-access-hg6s2\") pod \"da09420c-8163-4ee2-a87c-775dc33c4c88\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767285 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-config-data\") pod \"da09420c-8163-4ee2-a87c-775dc33c4c88\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767301 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-ceilometer-tls-certs\") pod \"da09420c-8163-4ee2-a87c-775dc33c4c88\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767368 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-sg-core-conf-yaml\") pod \"da09420c-8163-4ee2-a87c-775dc33c4c88\" (UID: \"da09420c-8163-4ee2-a87c-775dc33c4c88\") " Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767487 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767529 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767590 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da09420c-8163-4ee2-a87c-775dc33c4c88" (UID: 
"da09420c-8163-4ee2-a87c-775dc33c4c88"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767624 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6qjh\" (UniqueName: \"kubernetes.io/projected/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-kube-api-access-h6qjh\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767673 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.767695 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.768036 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.770181 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da09420c-8163-4ee2-a87c-775dc33c4c88" (UID: "da09420c-8163-4ee2-a87c-775dc33c4c88"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.804848 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da09420c-8163-4ee2-a87c-775dc33c4c88-kube-api-access-hg6s2" (OuterVolumeSpecName: "kube-api-access-hg6s2") pod "da09420c-8163-4ee2-a87c-775dc33c4c88" (UID: "da09420c-8163-4ee2-a87c-775dc33c4c88"). InnerVolumeSpecName "kube-api-access-hg6s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.806342 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-scripts" (OuterVolumeSpecName: "scripts") pod "da09420c-8163-4ee2-a87c-775dc33c4c88" (UID: "da09420c-8163-4ee2-a87c-775dc33c4c88"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.813859 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "da09420c-8163-4ee2-a87c-775dc33c4c88" (UID: "da09420c-8163-4ee2-a87c-775dc33c4c88"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.845704 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "da09420c-8163-4ee2-a87c-775dc33c4c88" (UID: "da09420c-8163-4ee2-a87c-775dc33c4c88"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.869702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.869776 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.869849 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6qjh\" (UniqueName: \"kubernetes.io/projected/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-kube-api-access-h6qjh\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.869885 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.869903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 
crc kubenswrapper[4790]: I1003 09:01:36.870002 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.870014 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da09420c-8163-4ee2-a87c-775dc33c4c88-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.870025 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg6s2\" (UniqueName: \"kubernetes.io/projected/da09420c-8163-4ee2-a87c-775dc33c4c88-kube-api-access-hg6s2\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.870040 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.870050 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.874009 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.874172 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.874984 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.877817 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.882868 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da09420c-8163-4ee2-a87c-775dc33c4c88" (UID: "da09420c-8163-4ee2-a87c-775dc33c4c88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.884218 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-config-data" (OuterVolumeSpecName: "config-data") pod "da09420c-8163-4ee2-a87c-775dc33c4c88" (UID: "da09420c-8163-4ee2-a87c-775dc33c4c88"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.888676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6qjh\" (UniqueName: \"kubernetes.io/projected/bbe5d109-dfdc-4f03-afa3-a71800a7d74b-kube-api-access-h6qjh\") pod \"nova-cell1-novncproxy-0\" (UID: \"bbe5d109-dfdc-4f03-afa3-a71800a7d74b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.971870 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:36 crc kubenswrapper[4790]: I1003 09:01:36.971905 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da09420c-8163-4ee2-a87c-775dc33c4c88-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.054241 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.394317 4790 generic.go:334] "Generic (PLEG): container finished" podID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerID="35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b" exitCode=0 Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.394391 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.394441 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da09420c-8163-4ee2-a87c-775dc33c4c88","Type":"ContainerDied","Data":"35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b"} Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.394481 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da09420c-8163-4ee2-a87c-775dc33c4c88","Type":"ContainerDied","Data":"a89ad72aace2d295e7e2c1a2666fb21d262bc9cf0e59692836851e434dc2e199"} Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.394503 4790 scope.go:117] "RemoveContainer" containerID="55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.399546 4790 generic.go:334] "Generic (PLEG): container finished" podID="ae318a68-abcf-4d99-ac81-222288742659" containerID="507f5bfd2cc42ead6ab943ac90369c48007561516a67a237767096f54ee24f69" exitCode=0 Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.399623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae318a68-abcf-4d99-ac81-222288742659","Type":"ContainerDied","Data":"507f5bfd2cc42ead6ab943ac90369c48007561516a67a237767096f54ee24f69"} Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.416464 4790 scope.go:117] "RemoveContainer" containerID="50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.435475 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.445053 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.463670 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 
03 09:01:37 crc kubenswrapper[4790]: E1003 09:01:37.464244 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="ceilometer-notification-agent" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.464262 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="ceilometer-notification-agent" Oct 03 09:01:37 crc kubenswrapper[4790]: E1003 09:01:37.464287 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="sg-core" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.464295 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="sg-core" Oct 03 09:01:37 crc kubenswrapper[4790]: E1003 09:01:37.464313 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="ceilometer-central-agent" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.464320 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="ceilometer-central-agent" Oct 03 09:01:37 crc kubenswrapper[4790]: E1003 09:01:37.464346 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="proxy-httpd" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.464353 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="proxy-httpd" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.465688 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="ceilometer-notification-agent" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.465730 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" 
containerName="proxy-httpd" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.465759 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="sg-core" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.465779 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" containerName="ceilometer-central-agent" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.472605 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.477848 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.478692 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.478925 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.479802 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.481978 4790 scope.go:117] "RemoveContainer" containerID="35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.504614 4790 scope.go:117] "RemoveContainer" containerID="9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.586448 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjlpl\" (UniqueName: \"kubernetes.io/projected/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-kube-api-access-tjlpl\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " 
pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.586509 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-config-data\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.586530 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.586557 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.586614 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.586642 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.586710 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.586733 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-scripts\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.613872 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.617041 4790 scope.go:117] "RemoveContainer" containerID="55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21" Oct 03 09:01:37 crc kubenswrapper[4790]: E1003 09:01:37.617388 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21\": container with ID starting with 55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21 not found: ID does not exist" containerID="55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.617433 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21"} err="failed to get container status \"55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21\": rpc error: code = NotFound desc = could not find container \"55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21\": container with ID starting with 
55218c52eb537cb2415f38b9eba50b6cc56a98f22f172ceefd877dbb935d3b21 not found: ID does not exist" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.617458 4790 scope.go:117] "RemoveContainer" containerID="50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871" Oct 03 09:01:37 crc kubenswrapper[4790]: E1003 09:01:37.617821 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871\": container with ID starting with 50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871 not found: ID does not exist" containerID="50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.617880 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871"} err="failed to get container status \"50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871\": rpc error: code = NotFound desc = could not find container \"50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871\": container with ID starting with 50001bf7bf71900dbf3d4952de5c4087cb01834ce09397e42da000ca0db17871 not found: ID does not exist" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.617913 4790 scope.go:117] "RemoveContainer" containerID="35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b" Oct 03 09:01:37 crc kubenswrapper[4790]: E1003 09:01:37.618211 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b\": container with ID starting with 35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b not found: ID does not exist" containerID="35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b" Oct 03 09:01:37 crc 
kubenswrapper[4790]: I1003 09:01:37.618235 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b"} err="failed to get container status \"35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b\": rpc error: code = NotFound desc = could not find container \"35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b\": container with ID starting with 35210c95fb4d2eda702013032c49a2dda3566aae2c999ae80b2955d98312b61b not found: ID does not exist" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.618248 4790 scope.go:117] "RemoveContainer" containerID="9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68" Oct 03 09:01:37 crc kubenswrapper[4790]: E1003 09:01:37.618480 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68\": container with ID starting with 9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68 not found: ID does not exist" containerID="9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.618519 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68"} err="failed to get container status \"9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68\": rpc error: code = NotFound desc = could not find container \"9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68\": container with ID starting with 9668ae3eacbcac8ff4f675e807d86a49ceeb45ae5b2b63bc11423a34b951df68 not found: ID does not exist" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.689112 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.689560 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.689636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.689702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.689731 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-scripts\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.689897 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjlpl\" (UniqueName: \"kubernetes.io/projected/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-kube-api-access-tjlpl\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc 
kubenswrapper[4790]: I1003 09:01:37.689942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-config-data\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.689972 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.704848 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-run-httpd\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.704854 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-log-httpd\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.707500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-scripts\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.709926 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-config-data\") pod \"ceilometer-0\" (UID: 
\"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.711006 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.719263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.722429 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.726790 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.727248 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjlpl\" (UniqueName: \"kubernetes.io/projected/ab369c6c-af88-4f07-bc5c-fb229a9b2cc8-kube-api-access-tjlpl\") pod \"ceilometer-0\" (UID: \"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8\") " pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.791156 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-config-data\") pod \"ae318a68-abcf-4d99-ac81-222288742659\" (UID: 
\"ae318a68-abcf-4d99-ac81-222288742659\") " Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.791292 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae318a68-abcf-4d99-ac81-222288742659-logs\") pod \"ae318a68-abcf-4d99-ac81-222288742659\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.791349 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rllwn\" (UniqueName: \"kubernetes.io/projected/ae318a68-abcf-4d99-ac81-222288742659-kube-api-access-rllwn\") pod \"ae318a68-abcf-4d99-ac81-222288742659\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.791446 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-combined-ca-bundle\") pod \"ae318a68-abcf-4d99-ac81-222288742659\" (UID: \"ae318a68-abcf-4d99-ac81-222288742659\") " Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.792193 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae318a68-abcf-4d99-ac81-222288742659-logs" (OuterVolumeSpecName: "logs") pod "ae318a68-abcf-4d99-ac81-222288742659" (UID: "ae318a68-abcf-4d99-ac81-222288742659"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.797189 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.799225 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae318a68-abcf-4d99-ac81-222288742659-kube-api-access-rllwn" (OuterVolumeSpecName: "kube-api-access-rllwn") pod "ae318a68-abcf-4d99-ac81-222288742659" (UID: "ae318a68-abcf-4d99-ac81-222288742659"). InnerVolumeSpecName "kube-api-access-rllwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.827700 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-config-data" (OuterVolumeSpecName: "config-data") pod "ae318a68-abcf-4d99-ac81-222288742659" (UID: "ae318a68-abcf-4d99-ac81-222288742659"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.833733 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae318a68-abcf-4d99-ac81-222288742659" (UID: "ae318a68-abcf-4d99-ac81-222288742659"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.895446 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.895472 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae318a68-abcf-4d99-ac81-222288742659-logs\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.895482 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rllwn\" (UniqueName: \"kubernetes.io/projected/ae318a68-abcf-4d99-ac81-222288742659-kube-api-access-rllwn\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:37 crc kubenswrapper[4790]: I1003 09:01:37.895490 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae318a68-abcf-4d99-ac81-222288742659-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.244803 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 03 09:01:38 crc kubenswrapper[4790]: W1003 09:01:38.251898 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab369c6c_af88_4f07_bc5c_fb229a9b2cc8.slice/crio-b35a17ce628861d8c891ef4933e2f3d8a3d06f8d5203f13d378caae69c57cf6a WatchSource:0}: Error finding container b35a17ce628861d8c891ef4933e2f3d8a3d06f8d5203f13d378caae69c57cf6a: Status 404 returned error can't find the container with id b35a17ce628861d8c891ef4933e2f3d8a3d06f8d5203f13d378caae69c57cf6a Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.346899 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17497332-4b10-4784-9052-b310611d9c0a" 
path="/var/lib/kubelet/pods/17497332-4b10-4784-9052-b310611d9c0a/volumes" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.347860 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da09420c-8163-4ee2-a87c-775dc33c4c88" path="/var/lib/kubelet/pods/da09420c-8163-4ee2-a87c-775dc33c4c88/volumes" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.424079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8","Type":"ContainerStarted","Data":"b35a17ce628861d8c891ef4933e2f3d8a3d06f8d5203f13d378caae69c57cf6a"} Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.428153 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae318a68-abcf-4d99-ac81-222288742659","Type":"ContainerDied","Data":"6fc4ffb28c39d299990eb381bbfbdc0a384c81fd683af0228092326cf0ff0e3b"} Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.428187 4790 scope.go:117] "RemoveContainer" containerID="507f5bfd2cc42ead6ab943ac90369c48007561516a67a237767096f54ee24f69" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.428306 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.434031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bbe5d109-dfdc-4f03-afa3-a71800a7d74b","Type":"ContainerStarted","Data":"24a159daa1d9921ff2f5c60ff24e484884eb8bfcb12fede91d73413af35f6b8e"} Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.434070 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bbe5d109-dfdc-4f03-afa3-a71800a7d74b","Type":"ContainerStarted","Data":"32d3d994f34d2b6cd1c2d76196f68460c1eb8e5da5b280b15e5d33b184423723"} Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.466448 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.466382234 podStartE2EDuration="2.466382234s" podCreationTimestamp="2025-10-03 09:01:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:38.449083059 +0000 UTC m=+1284.802017307" watchObservedRunningTime="2025-10-03 09:01:38.466382234 +0000 UTC m=+1284.819316482" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.493556 4790 scope.go:117] "RemoveContainer" containerID="9ebf0a050ae49b26574fcd2a3a3d7554664709d9a087cf5a8bc8807d902e4fff" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.533952 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.559161 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.566037 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:38 crc kubenswrapper[4790]: E1003 09:01:38.566497 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ae318a68-abcf-4d99-ac81-222288742659" containerName="nova-api-log" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.566519 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae318a68-abcf-4d99-ac81-222288742659" containerName="nova-api-log" Oct 03 09:01:38 crc kubenswrapper[4790]: E1003 09:01:38.566574 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae318a68-abcf-4d99-ac81-222288742659" containerName="nova-api-api" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.566597 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae318a68-abcf-4d99-ac81-222288742659" containerName="nova-api-api" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.566817 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae318a68-abcf-4d99-ac81-222288742659" containerName="nova-api-log" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.566851 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae318a68-abcf-4d99-ac81-222288742659" containerName="nova-api-api" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.568096 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.576021 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.576462 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.579707 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.582261 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.713336 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47cec82-31c1-4842-8a5a-fc233e1235f8-logs\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.713400 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.713761 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.713850 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcmb4\" 
(UniqueName: \"kubernetes.io/projected/a47cec82-31c1-4842-8a5a-fc233e1235f8-kube-api-access-fcmb4\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.714155 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-config-data\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.716024 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-public-tls-certs\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.818184 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.818251 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcmb4\" (UniqueName: \"kubernetes.io/projected/a47cec82-31c1-4842-8a5a-fc233e1235f8-kube-api-access-fcmb4\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.818368 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-config-data\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " 
pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.818428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-public-tls-certs\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.818453 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47cec82-31c1-4842-8a5a-fc233e1235f8-logs\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.818479 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.819247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47cec82-31c1-4842-8a5a-fc233e1235f8-logs\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.824550 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.824638 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-public-tls-certs\") 
pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.825209 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.827342 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-config-data\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.834392 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcmb4\" (UniqueName: \"kubernetes.io/projected/a47cec82-31c1-4842-8a5a-fc233e1235f8-kube-api-access-fcmb4\") pod \"nova-api-0\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") " pod="openstack/nova-api-0" Oct 03 09:01:38 crc kubenswrapper[4790]: I1003 09:01:38.903175 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 03 09:01:39 crc kubenswrapper[4790]: I1003 09:01:39.214416 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:39 crc kubenswrapper[4790]: W1003 09:01:39.218052 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda47cec82_31c1_4842_8a5a_fc233e1235f8.slice/crio-a8968e31d2161fbc58e97a8152d26612bb239d9cce24da80d0e774111ae3348a WatchSource:0}: Error finding container a8968e31d2161fbc58e97a8152d26612bb239d9cce24da80d0e774111ae3348a: Status 404 returned error can't find the container with id a8968e31d2161fbc58e97a8152d26612bb239d9cce24da80d0e774111ae3348a Oct 03 09:01:39 crc kubenswrapper[4790]: I1003 09:01:39.447455 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8","Type":"ContainerStarted","Data":"5b7fda9d415b0dd7954f0ad31756747d015b5fe46ec61fa37f272677a5f4b0d9"} Oct 03 09:01:39 crc kubenswrapper[4790]: I1003 09:01:39.453649 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47cec82-31c1-4842-8a5a-fc233e1235f8","Type":"ContainerStarted","Data":"a8968e31d2161fbc58e97a8152d26612bb239d9cce24da80d0e774111ae3348a"} Oct 03 09:01:40 crc kubenswrapper[4790]: I1003 09:01:40.186363 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:01:40 crc kubenswrapper[4790]: I1003 09:01:40.186807 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:01:40 crc kubenswrapper[4790]: I1003 09:01:40.345364 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae318a68-abcf-4d99-ac81-222288742659" path="/var/lib/kubelet/pods/ae318a68-abcf-4d99-ac81-222288742659/volumes" Oct 03 09:01:40 crc kubenswrapper[4790]: I1003 09:01:40.466401 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47cec82-31c1-4842-8a5a-fc233e1235f8","Type":"ContainerStarted","Data":"25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af"} Oct 03 09:01:40 crc kubenswrapper[4790]: I1003 09:01:40.466443 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47cec82-31c1-4842-8a5a-fc233e1235f8","Type":"ContainerStarted","Data":"b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe"} Oct 03 09:01:40 crc kubenswrapper[4790]: I1003 09:01:40.468738 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8","Type":"ContainerStarted","Data":"19eb46ebf56282c6bbef52abf06e793f70d84924da7a945c5d57a5ca06bf6217"} Oct 03 09:01:40 crc kubenswrapper[4790]: I1003 09:01:40.504789 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.504767157 podStartE2EDuration="2.504767157s" podCreationTimestamp="2025-10-03 09:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:40.496053758 +0000 UTC m=+1286.848988016" watchObservedRunningTime="2025-10-03 09:01:40.504767157 +0000 UTC m=+1286.857701395" Oct 03 09:01:41 crc kubenswrapper[4790]: I1003 09:01:41.480639 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8","Type":"ContainerStarted","Data":"ccb8497ffc44b07d7164d76c7a640fb70b79cd8a13f1394c3edcc4c4d5b5d243"} Oct 03 09:01:42 crc kubenswrapper[4790]: I1003 09:01:42.055377 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:42 crc kubenswrapper[4790]: I1003 09:01:42.491207 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ab369c6c-af88-4f07-bc5c-fb229a9b2cc8","Type":"ContainerStarted","Data":"b0bbbf43a17f70bfd8125d13f4158920521c43f31f5f7736bab421016373b0b3"} Oct 03 09:01:42 crc kubenswrapper[4790]: I1003 09:01:42.491377 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 03 09:01:42 crc kubenswrapper[4790]: I1003 09:01:42.520134 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.267419087 podStartE2EDuration="5.520110289s" podCreationTimestamp="2025-10-03 09:01:37 +0000 UTC" firstStartedPulling="2025-10-03 09:01:38.254772844 +0000 UTC m=+1284.607707092" lastFinishedPulling="2025-10-03 09:01:41.507464046 +0000 UTC m=+1287.860398294" observedRunningTime="2025-10-03 09:01:42.51211463 +0000 UTC m=+1288.865048888" watchObservedRunningTime="2025-10-03 09:01:42.520110289 +0000 UTC m=+1288.873044537" Oct 03 09:01:42 crc kubenswrapper[4790]: I1003 09:01:42.837657 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:01:42 crc kubenswrapper[4790]: I1003 09:01:42.926898 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859b488d7-9flsg"] Oct 03 09:01:42 crc kubenswrapper[4790]: I1003 09:01:42.927243 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" podUID="ac5587ac-3a33-40b8-b068-bbfee9cb702b" containerName="dnsmasq-dns" 
containerID="cri-o://2ce6ad919c4abde1bd65f5a01d210a624389d2c7552ea3c4d04d6f1762fbca6d" gracePeriod=10 Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.502665 4790 generic.go:334] "Generic (PLEG): container finished" podID="ac5587ac-3a33-40b8-b068-bbfee9cb702b" containerID="2ce6ad919c4abde1bd65f5a01d210a624389d2c7552ea3c4d04d6f1762fbca6d" exitCode=0 Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.502758 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" event={"ID":"ac5587ac-3a33-40b8-b068-bbfee9cb702b","Type":"ContainerDied","Data":"2ce6ad919c4abde1bd65f5a01d210a624389d2c7552ea3c4d04d6f1762fbca6d"} Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.503401 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" event={"ID":"ac5587ac-3a33-40b8-b068-bbfee9cb702b","Type":"ContainerDied","Data":"cd16f9894b502ba18df3f7209cf6f8adaacadae989d1010144da452f1076fe80"} Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.503418 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd16f9894b502ba18df3f7209cf6f8adaacadae989d1010144da452f1076fe80" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.505731 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.624270 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-svc\") pod \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.624396 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-sb\") pod \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.624433 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-config\") pod \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.624499 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-swift-storage-0\") pod \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.624526 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kchpd\" (UniqueName: \"kubernetes.io/projected/ac5587ac-3a33-40b8-b068-bbfee9cb702b-kube-api-access-kchpd\") pod \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.624807 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-nb\") pod \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\" (UID: \"ac5587ac-3a33-40b8-b068-bbfee9cb702b\") " Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.631267 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5587ac-3a33-40b8-b068-bbfee9cb702b-kube-api-access-kchpd" (OuterVolumeSpecName: "kube-api-access-kchpd") pod "ac5587ac-3a33-40b8-b068-bbfee9cb702b" (UID: "ac5587ac-3a33-40b8-b068-bbfee9cb702b"). InnerVolumeSpecName "kube-api-access-kchpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.690094 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-config" (OuterVolumeSpecName: "config") pod "ac5587ac-3a33-40b8-b068-bbfee9cb702b" (UID: "ac5587ac-3a33-40b8-b068-bbfee9cb702b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.696424 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac5587ac-3a33-40b8-b068-bbfee9cb702b" (UID: "ac5587ac-3a33-40b8-b068-bbfee9cb702b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.709020 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac5587ac-3a33-40b8-b068-bbfee9cb702b" (UID: "ac5587ac-3a33-40b8-b068-bbfee9cb702b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.724464 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ac5587ac-3a33-40b8-b068-bbfee9cb702b" (UID: "ac5587ac-3a33-40b8-b068-bbfee9cb702b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.725797 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac5587ac-3a33-40b8-b068-bbfee9cb702b" (UID: "ac5587ac-3a33-40b8-b068-bbfee9cb702b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.727140 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.727161 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.727170 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.727181 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kchpd\" (UniqueName: \"kubernetes.io/projected/ac5587ac-3a33-40b8-b068-bbfee9cb702b-kube-api-access-kchpd\") on node \"crc\" 
DevicePath \"\"" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.727191 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:43 crc kubenswrapper[4790]: I1003 09:01:43.727199 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac5587ac-3a33-40b8-b068-bbfee9cb702b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:44 crc kubenswrapper[4790]: I1003 09:01:44.523000 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859b488d7-9flsg" Oct 03 09:01:44 crc kubenswrapper[4790]: I1003 09:01:44.555613 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859b488d7-9flsg"] Oct 03 09:01:44 crc kubenswrapper[4790]: I1003 09:01:44.565539 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7859b488d7-9flsg"] Oct 03 09:01:46 crc kubenswrapper[4790]: I1003 09:01:46.348218 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5587ac-3a33-40b8-b068-bbfee9cb702b" path="/var/lib/kubelet/pods/ac5587ac-3a33-40b8-b068-bbfee9cb702b/volumes" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.054723 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.076520 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.576490 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.741505 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-sjzpk"] Oct 03 09:01:47 crc kubenswrapper[4790]: E1003 09:01:47.742478 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5587ac-3a33-40b8-b068-bbfee9cb702b" containerName="init" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.742507 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5587ac-3a33-40b8-b068-bbfee9cb702b" containerName="init" Oct 03 09:01:47 crc kubenswrapper[4790]: E1003 09:01:47.742530 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5587ac-3a33-40b8-b068-bbfee9cb702b" containerName="dnsmasq-dns" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.742539 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5587ac-3a33-40b8-b068-bbfee9cb702b" containerName="dnsmasq-dns" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.742868 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5587ac-3a33-40b8-b068-bbfee9cb702b" containerName="dnsmasq-dns" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.743765 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.747164 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.754187 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.758877 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sjzpk"] Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.912010 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.912121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwb4\" (UniqueName: \"kubernetes.io/projected/04e23e9a-a675-40ea-a178-37b08cd01b51-kube-api-access-crwb4\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.912357 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-scripts\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:47 crc kubenswrapper[4790]: I1003 09:01:47.912560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-config-data\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.022911 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.023174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crwb4\" (UniqueName: \"kubernetes.io/projected/04e23e9a-a675-40ea-a178-37b08cd01b51-kube-api-access-crwb4\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.023257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-scripts\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.023298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-config-data\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.028586 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.028947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-scripts\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.031994 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-config-data\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.040376 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwb4\" (UniqueName: \"kubernetes.io/projected/04e23e9a-a675-40ea-a178-37b08cd01b51-kube-api-access-crwb4\") pod \"nova-cell1-cell-mapping-sjzpk\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.066023 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.519535 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sjzpk"] Oct 03 09:01:48 crc kubenswrapper[4790]: W1003 09:01:48.524911 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04e23e9a_a675_40ea_a178_37b08cd01b51.slice/crio-9c8e99f6daf5d00e993a8842196f59dd4bb786724638cd580c97c709550c10e4 WatchSource:0}: Error finding container 9c8e99f6daf5d00e993a8842196f59dd4bb786724638cd580c97c709550c10e4: Status 404 returned error can't find the container with id 9c8e99f6daf5d00e993a8842196f59dd4bb786724638cd580c97c709550c10e4 Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.570609 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sjzpk" event={"ID":"04e23e9a-a675-40ea-a178-37b08cd01b51","Type":"ContainerStarted","Data":"9c8e99f6daf5d00e993a8842196f59dd4bb786724638cd580c97c709550c10e4"} Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.904396 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:01:48 crc kubenswrapper[4790]: I1003 09:01:48.904662 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:01:49 crc kubenswrapper[4790]: I1003 09:01:49.586916 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sjzpk" event={"ID":"04e23e9a-a675-40ea-a178-37b08cd01b51","Type":"ContainerStarted","Data":"3b4ae2d051a5407e976adcbf3502dec4289adf7ccb7e35db1e8b14cdf13c9ee4"} Oct 03 09:01:49 crc kubenswrapper[4790]: I1003 09:01:49.609542 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-sjzpk" podStartSLOduration=2.609519569 podStartE2EDuration="2.609519569s" 
podCreationTimestamp="2025-10-03 09:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:01:49.601977863 +0000 UTC m=+1295.954912101" watchObservedRunningTime="2025-10-03 09:01:49.609519569 +0000 UTC m=+1295.962453807" Oct 03 09:01:49 crc kubenswrapper[4790]: I1003 09:01:49.919822 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 09:01:49 crc kubenswrapper[4790]: I1003 09:01:49.919838 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 09:01:54 crc kubenswrapper[4790]: I1003 09:01:54.645693 4790 generic.go:334] "Generic (PLEG): container finished" podID="04e23e9a-a675-40ea-a178-37b08cd01b51" containerID="3b4ae2d051a5407e976adcbf3502dec4289adf7ccb7e35db1e8b14cdf13c9ee4" exitCode=0 Oct 03 09:01:54 crc kubenswrapper[4790]: I1003 09:01:54.645786 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sjzpk" event={"ID":"04e23e9a-a675-40ea-a178-37b08cd01b51","Type":"ContainerDied","Data":"3b4ae2d051a5407e976adcbf3502dec4289adf7ccb7e35db1e8b14cdf13c9ee4"} Oct 03 09:01:55 crc kubenswrapper[4790]: I1003 09:01:55.990032 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.085636 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-config-data\") pod \"04e23e9a-a675-40ea-a178-37b08cd01b51\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.085834 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crwb4\" (UniqueName: \"kubernetes.io/projected/04e23e9a-a675-40ea-a178-37b08cd01b51-kube-api-access-crwb4\") pod \"04e23e9a-a675-40ea-a178-37b08cd01b51\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.085900 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-combined-ca-bundle\") pod \"04e23e9a-a675-40ea-a178-37b08cd01b51\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.085983 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-scripts\") pod \"04e23e9a-a675-40ea-a178-37b08cd01b51\" (UID: \"04e23e9a-a675-40ea-a178-37b08cd01b51\") " Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.096441 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e23e9a-a675-40ea-a178-37b08cd01b51-kube-api-access-crwb4" (OuterVolumeSpecName: "kube-api-access-crwb4") pod "04e23e9a-a675-40ea-a178-37b08cd01b51" (UID: "04e23e9a-a675-40ea-a178-37b08cd01b51"). InnerVolumeSpecName "kube-api-access-crwb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.099739 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-scripts" (OuterVolumeSpecName: "scripts") pod "04e23e9a-a675-40ea-a178-37b08cd01b51" (UID: "04e23e9a-a675-40ea-a178-37b08cd01b51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.121845 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04e23e9a-a675-40ea-a178-37b08cd01b51" (UID: "04e23e9a-a675-40ea-a178-37b08cd01b51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.132646 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-config-data" (OuterVolumeSpecName: "config-data") pod "04e23e9a-a675-40ea-a178-37b08cd01b51" (UID: "04e23e9a-a675-40ea-a178-37b08cd01b51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.188674 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crwb4\" (UniqueName: \"kubernetes.io/projected/04e23e9a-a675-40ea-a178-37b08cd01b51-kube-api-access-crwb4\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.188721 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.188735 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-scripts\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.188746 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e23e9a-a675-40ea-a178-37b08cd01b51-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.664234 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sjzpk" event={"ID":"04e23e9a-a675-40ea-a178-37b08cd01b51","Type":"ContainerDied","Data":"9c8e99f6daf5d00e993a8842196f59dd4bb786724638cd580c97c709550c10e4"} Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.664273 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c8e99f6daf5d00e993a8842196f59dd4bb786724638cd580c97c709550c10e4" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.664317 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sjzpk" Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.852895 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.857808 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerName="nova-api-api" containerID="cri-o://25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af" gracePeriod=30 Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.869094 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.869338 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2" containerName="nova-scheduler-scheduler" containerID="cri-o://ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e" gracePeriod=30 Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.909042 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.909600 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerName="nova-metadata-log" containerID="cri-o://b4e7ca2907c6c3d5b79917495ec50be29601d0b1109983fec0b2484e44f7e029" gracePeriod=30 Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.909823 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerName="nova-metadata-metadata" containerID="cri-o://cb8f2fd06148764e43f1d0e5967da1b50cf9da473be0289f0d22f4759f30f2b5" gracePeriod=30 Oct 03 09:01:56 crc kubenswrapper[4790]: I1003 09:01:56.911739 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerName="nova-api-log" containerID="cri-o://b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe" gracePeriod=30 Oct 03 09:01:57 crc kubenswrapper[4790]: I1003 09:01:57.676037 4790 generic.go:334] "Generic (PLEG): container finished" podID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerID="b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe" exitCode=143 Oct 03 09:01:57 crc kubenswrapper[4790]: I1003 09:01:57.676199 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47cec82-31c1-4842-8a5a-fc233e1235f8","Type":"ContainerDied","Data":"b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe"} Oct 03 09:01:57 crc kubenswrapper[4790]: I1003 09:01:57.678357 4790 generic.go:334] "Generic (PLEG): container finished" podID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerID="b4e7ca2907c6c3d5b79917495ec50be29601d0b1109983fec0b2484e44f7e029" exitCode=143 Oct 03 09:01:57 crc kubenswrapper[4790]: I1003 09:01:57.678390 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4e8431e-8789-4bda-a146-4b2c056eb240","Type":"ContainerDied","Data":"b4e7ca2907c6c3d5b79917495ec50be29601d0b1109983fec0b2484e44f7e029"} Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.275149 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.432333 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtvdh\" (UniqueName: \"kubernetes.io/projected/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-kube-api-access-wtvdh\") pod \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.432543 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-config-data\") pod \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.432792 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-combined-ca-bundle\") pod \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\" (UID: \"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2\") " Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.438355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-kube-api-access-wtvdh" (OuterVolumeSpecName: "kube-api-access-wtvdh") pod "8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2" (UID: "8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2"). InnerVolumeSpecName "kube-api-access-wtvdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.464875 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2" (UID: "8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.466619 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-config-data" (OuterVolumeSpecName: "config-data") pod "8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2" (UID: "8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.535410 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.535601 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtvdh\" (UniqueName: \"kubernetes.io/projected/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-kube-api-access-wtvdh\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.535676 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.692324 4790 generic.go:334] "Generic (PLEG): container finished" podID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerID="cb8f2fd06148764e43f1d0e5967da1b50cf9da473be0289f0d22f4759f30f2b5" exitCode=0 Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.692400 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4e8431e-8789-4bda-a146-4b2c056eb240","Type":"ContainerDied","Data":"cb8f2fd06148764e43f1d0e5967da1b50cf9da473be0289f0d22f4759f30f2b5"} Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.694409 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2" containerID="ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e" exitCode=0 Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.694438 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2","Type":"ContainerDied","Data":"ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e"} Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.694453 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2","Type":"ContainerDied","Data":"c7dd849ea70c2f5fe02cee83a028d389a040e2b4070b68e3a4aa78092f555d57"} Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.694451 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.694496 4790 scope.go:117] "RemoveContainer" containerID="ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.714932 4790 scope.go:117] "RemoveContainer" containerID="ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e" Oct 03 09:01:58 crc kubenswrapper[4790]: E1003 09:01:58.716208 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e\": container with ID starting with ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e not found: ID does not exist" containerID="ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.716259 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e"} err="failed to get container 
status \"ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e\": rpc error: code = NotFound desc = could not find container \"ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e\": container with ID starting with ac07591fe4cd824cfd309afb5edac51bbf69f5b2173ac133c668b99f0f65d90e not found: ID does not exist" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.730479 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.768300 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.783822 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:58 crc kubenswrapper[4790]: E1003 09:01:58.786193 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e23e9a-a675-40ea-a178-37b08cd01b51" containerName="nova-manage" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.786222 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e23e9a-a675-40ea-a178-37b08cd01b51" containerName="nova-manage" Oct 03 09:01:58 crc kubenswrapper[4790]: E1003 09:01:58.786274 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2" containerName="nova-scheduler-scheduler" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.786285 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2" containerName="nova-scheduler-scheduler" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.786518 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e23e9a-a675-40ea-a178-37b08cd01b51" containerName="nova-manage" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.786544 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2" 
containerName="nova-scheduler-scheduler" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.789028 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.794639 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.803286 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.927928 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.956444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-998b9\" (UniqueName: \"kubernetes.io/projected/87e3a414-fd4b-49ac-8e64-3514f9725df1-kube-api-access-998b9\") pod \"nova-scheduler-0\" (UID: \"87e3a414-fd4b-49ac-8e64-3514f9725df1\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.956565 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e3a414-fd4b-49ac-8e64-3514f9725df1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87e3a414-fd4b-49ac-8e64-3514f9725df1\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:58 crc kubenswrapper[4790]: I1003 09:01:58.956791 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87e3a414-fd4b-49ac-8e64-3514f9725df1-config-data\") pod \"nova-scheduler-0\" (UID: \"87e3a414-fd4b-49ac-8e64-3514f9725df1\") " pod="openstack/nova-scheduler-0" Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.057879 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4e8431e-8789-4bda-a146-4b2c056eb240-logs\") pod \"c4e8431e-8789-4bda-a146-4b2c056eb240\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.058040 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wcfn\" (UniqueName: \"kubernetes.io/projected/c4e8431e-8789-4bda-a146-4b2c056eb240-kube-api-access-8wcfn\") pod \"c4e8431e-8789-4bda-a146-4b2c056eb240\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.058144 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-config-data\") pod \"c4e8431e-8789-4bda-a146-4b2c056eb240\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.058167 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-combined-ca-bundle\") pod \"c4e8431e-8789-4bda-a146-4b2c056eb240\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.058201 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-nova-metadata-tls-certs\") pod \"c4e8431e-8789-4bda-a146-4b2c056eb240\" (UID: \"c4e8431e-8789-4bda-a146-4b2c056eb240\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.059288 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-998b9\" (UniqueName: \"kubernetes.io/projected/87e3a414-fd4b-49ac-8e64-3514f9725df1-kube-api-access-998b9\") pod \"nova-scheduler-0\" (UID: \"87e3a414-fd4b-49ac-8e64-3514f9725df1\") " pod="openstack/nova-scheduler-0"
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.059418 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e3a414-fd4b-49ac-8e64-3514f9725df1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87e3a414-fd4b-49ac-8e64-3514f9725df1\") " pod="openstack/nova-scheduler-0"
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.059534 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87e3a414-fd4b-49ac-8e64-3514f9725df1-config-data\") pod \"nova-scheduler-0\" (UID: \"87e3a414-fd4b-49ac-8e64-3514f9725df1\") " pod="openstack/nova-scheduler-0"
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.059779 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4e8431e-8789-4bda-a146-4b2c056eb240-logs" (OuterVolumeSpecName: "logs") pod "c4e8431e-8789-4bda-a146-4b2c056eb240" (UID: "c4e8431e-8789-4bda-a146-4b2c056eb240"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.066037 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e8431e-8789-4bda-a146-4b2c056eb240-kube-api-access-8wcfn" (OuterVolumeSpecName: "kube-api-access-8wcfn") pod "c4e8431e-8789-4bda-a146-4b2c056eb240" (UID: "c4e8431e-8789-4bda-a146-4b2c056eb240"). InnerVolumeSpecName "kube-api-access-8wcfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.067601 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e3a414-fd4b-49ac-8e64-3514f9725df1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"87e3a414-fd4b-49ac-8e64-3514f9725df1\") " pod="openstack/nova-scheduler-0"
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.079463 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87e3a414-fd4b-49ac-8e64-3514f9725df1-config-data\") pod \"nova-scheduler-0\" (UID: \"87e3a414-fd4b-49ac-8e64-3514f9725df1\") " pod="openstack/nova-scheduler-0"
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.092271 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-998b9\" (UniqueName: \"kubernetes.io/projected/87e3a414-fd4b-49ac-8e64-3514f9725df1-kube-api-access-998b9\") pod \"nova-scheduler-0\" (UID: \"87e3a414-fd4b-49ac-8e64-3514f9725df1\") " pod="openstack/nova-scheduler-0"
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.105691 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4e8431e-8789-4bda-a146-4b2c056eb240" (UID: "c4e8431e-8789-4bda-a146-4b2c056eb240"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.119251 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-config-data" (OuterVolumeSpecName: "config-data") pod "c4e8431e-8789-4bda-a146-4b2c056eb240" (UID: "c4e8431e-8789-4bda-a146-4b2c056eb240"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.128913 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.132905 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c4e8431e-8789-4bda-a146-4b2c056eb240" (UID: "c4e8431e-8789-4bda-a146-4b2c056eb240"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.160929 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4e8431e-8789-4bda-a146-4b2c056eb240-logs\") on node \"crc\" DevicePath \"\""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.160955 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wcfn\" (UniqueName: \"kubernetes.io/projected/c4e8431e-8789-4bda-a146-4b2c056eb240-kube-api-access-8wcfn\") on node \"crc\" DevicePath \"\""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.160969 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.160978 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.160988 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4e8431e-8789-4bda-a146-4b2c056eb240-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.170326 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.261698 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-public-tls-certs\") pod \"a47cec82-31c1-4842-8a5a-fc233e1235f8\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.262169 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcmb4\" (UniqueName: \"kubernetes.io/projected/a47cec82-31c1-4842-8a5a-fc233e1235f8-kube-api-access-fcmb4\") pod \"a47cec82-31c1-4842-8a5a-fc233e1235f8\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.262194 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47cec82-31c1-4842-8a5a-fc233e1235f8-logs\") pod \"a47cec82-31c1-4842-8a5a-fc233e1235f8\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.262766 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-internal-tls-certs\") pod \"a47cec82-31c1-4842-8a5a-fc233e1235f8\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.262850 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-config-data\") pod \"a47cec82-31c1-4842-8a5a-fc233e1235f8\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.262882 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-combined-ca-bundle\") pod \"a47cec82-31c1-4842-8a5a-fc233e1235f8\" (UID: \"a47cec82-31c1-4842-8a5a-fc233e1235f8\") "
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.263732 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a47cec82-31c1-4842-8a5a-fc233e1235f8-logs" (OuterVolumeSpecName: "logs") pod "a47cec82-31c1-4842-8a5a-fc233e1235f8" (UID: "a47cec82-31c1-4842-8a5a-fc233e1235f8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.266825 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a47cec82-31c1-4842-8a5a-fc233e1235f8-kube-api-access-fcmb4" (OuterVolumeSpecName: "kube-api-access-fcmb4") pod "a47cec82-31c1-4842-8a5a-fc233e1235f8" (UID: "a47cec82-31c1-4842-8a5a-fc233e1235f8"). InnerVolumeSpecName "kube-api-access-fcmb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.327433 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a47cec82-31c1-4842-8a5a-fc233e1235f8" (UID: "a47cec82-31c1-4842-8a5a-fc233e1235f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.369532 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcmb4\" (UniqueName: \"kubernetes.io/projected/a47cec82-31c1-4842-8a5a-fc233e1235f8-kube-api-access-fcmb4\") on node \"crc\" DevicePath \"\""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.369803 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47cec82-31c1-4842-8a5a-fc233e1235f8-logs\") on node \"crc\" DevicePath \"\""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.370273 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.372688 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a47cec82-31c1-4842-8a5a-fc233e1235f8" (UID: "a47cec82-31c1-4842-8a5a-fc233e1235f8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.374478 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-config-data" (OuterVolumeSpecName: "config-data") pod "a47cec82-31c1-4842-8a5a-fc233e1235f8" (UID: "a47cec82-31c1-4842-8a5a-fc233e1235f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 09:01:59 crc kubenswrapper[4790]: I1003 09:01:59.383004 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a47cec82-31c1-4842-8a5a-fc233e1235f8" (UID: "a47cec82-31c1-4842-8a5a-fc233e1235f8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.471994 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.472021 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-config-data\") on node \"crc\" DevicePath \"\""
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.472029 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a47cec82-31c1-4842-8a5a-fc233e1235f8-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.635065 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: W1003 09:01:59.643838 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87e3a414_fd4b_49ac_8e64_3514f9725df1.slice/crio-aa9923c685a6496e3435aeafc0cf6e2409cd3f5e096e7f1b5249984f31ed7fd4 WatchSource:0}: Error finding container aa9923c685a6496e3435aeafc0cf6e2409cd3f5e096e7f1b5249984f31ed7fd4: Status 404 returned error can't find the container with id aa9923c685a6496e3435aeafc0cf6e2409cd3f5e096e7f1b5249984f31ed7fd4
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.707945 4790 generic.go:334] "Generic (PLEG): container finished" podID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerID="25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af" exitCode=0
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.708026 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.708024 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47cec82-31c1-4842-8a5a-fc233e1235f8","Type":"ContainerDied","Data":"25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af"}
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.708213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47cec82-31c1-4842-8a5a-fc233e1235f8","Type":"ContainerDied","Data":"a8968e31d2161fbc58e97a8152d26612bb239d9cce24da80d0e774111ae3348a"}
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.708274 4790 scope.go:117] "RemoveContainer" containerID="25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.711475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87e3a414-fd4b-49ac-8e64-3514f9725df1","Type":"ContainerStarted","Data":"aa9923c685a6496e3435aeafc0cf6e2409cd3f5e096e7f1b5249984f31ed7fd4"}
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.714015 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c4e8431e-8789-4bda-a146-4b2c056eb240","Type":"ContainerDied","Data":"6cbfd1712bfe4132a42482d57f67e458dcb0b03feedd82c68ac1efced4af49e8"}
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.714134 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.741628 4790 scope.go:117] "RemoveContainer" containerID="b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.769034 4790 scope.go:117] "RemoveContainer" containerID="25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af"
Oct 03 09:02:00 crc kubenswrapper[4790]: E1003 09:01:59.769617 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af\": container with ID starting with 25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af not found: ID does not exist" containerID="25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.769653 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af"} err="failed to get container status \"25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af\": rpc error: code = NotFound desc = could not find container \"25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af\": container with ID starting with 25596ae34b254dec0db79a0649008347e976237ae84deb8da4ef55ae2fa7b2af not found: ID does not exist"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.769677 4790 scope.go:117] "RemoveContainer" containerID="b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe"
Oct 03 09:02:00 crc kubenswrapper[4790]: E1003 09:01:59.770280 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe\": container with ID starting with b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe not found: ID does not exist" containerID="b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.770306 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe"} err="failed to get container status \"b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe\": rpc error: code = NotFound desc = could not find container \"b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe\": container with ID starting with b779a8a324b7e5586975b19cbb5f1cefb30609edd73eea50f767ef6e56a8fcfe not found: ID does not exist"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.770323 4790 scope.go:117] "RemoveContainer" containerID="cb8f2fd06148764e43f1d0e5967da1b50cf9da473be0289f0d22f4759f30f2b5"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.772040 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.794212 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.819818 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.827008 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.836197 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: E1003 09:01:59.837087 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerName="nova-api-log"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.837107 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerName="nova-api-log"
Oct 03 09:02:00 crc kubenswrapper[4790]: E1003 09:01:59.837160 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerName="nova-metadata-metadata"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.837170 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerName="nova-metadata-metadata"
Oct 03 09:02:00 crc kubenswrapper[4790]: E1003 09:01:59.837201 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerName="nova-metadata-log"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.837237 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerName="nova-metadata-log"
Oct 03 09:02:00 crc kubenswrapper[4790]: E1003 09:01:59.837256 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerName="nova-api-api"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.837263 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerName="nova-api-api"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.837709 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerName="nova-metadata-metadata"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.837734 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" containerName="nova-metadata-log"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.837747 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerName="nova-api-api"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.837782 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" containerName="nova-api-log"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.840756 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.845924 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.845960 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.845924 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.846248 4790 scope.go:117] "RemoveContainer" containerID="b4e7ca2907c6c3d5b79917495ec50be29601d0b1109983fec0b2484e44f7e029"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.848864 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.869421 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.869534 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.872053 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.872246 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:01:59.873032 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.000649 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b829e543-5d7d-4e9b-9674-9c24d060a0c5-config-data\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.000758 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9x7g\" (UniqueName: \"kubernetes.io/projected/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-kube-api-access-f9x7g\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.000831 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-config-data\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.002448 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.002497 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b829e543-5d7d-4e9b-9674-9c24d060a0c5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.002554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.002615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b829e543-5d7d-4e9b-9674-9c24d060a0c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.002652 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-logs\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.002714 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b829e543-5d7d-4e9b-9674-9c24d060a0c5-logs\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.002769 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-public-tls-certs\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.002794 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkjjv\" (UniqueName: \"kubernetes.io/projected/b829e543-5d7d-4e9b-9674-9c24d060a0c5-kube-api-access-jkjjv\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.105192 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.105249 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b829e543-5d7d-4e9b-9674-9c24d060a0c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.105290 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-logs\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.105333 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b829e543-5d7d-4e9b-9674-9c24d060a0c5-logs\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.105377 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-public-tls-certs\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.106003 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkjjv\" (UniqueName: \"kubernetes.io/projected/b829e543-5d7d-4e9b-9674-9c24d060a0c5-kube-api-access-jkjjv\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.105946 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b829e543-5d7d-4e9b-9674-9c24d060a0c5-logs\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.106186 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b829e543-5d7d-4e9b-9674-9c24d060a0c5-config-data\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.106220 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-logs\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.106288 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9x7g\" (UniqueName: \"kubernetes.io/projected/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-kube-api-access-f9x7g\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.106956 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-config-data\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.107353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.107402 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b829e543-5d7d-4e9b-9674-9c24d060a0c5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.111300 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-public-tls-certs\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.111514 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b829e543-5d7d-4e9b-9674-9c24d060a0c5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.112339 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b829e543-5d7d-4e9b-9674-9c24d060a0c5-config-data\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.112538 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.113308 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-config-data\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.113478 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b829e543-5d7d-4e9b-9674-9c24d060a0c5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.114969 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.129192 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9x7g\" (UniqueName: \"kubernetes.io/projected/f3e4ca6a-6d02-4c7e-b549-64e5900371ab-kube-api-access-f9x7g\") pod \"nova-api-0\" (UID: \"f3e4ca6a-6d02-4c7e-b549-64e5900371ab\") " pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.136600 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkjjv\" (UniqueName: \"kubernetes.io/projected/b829e543-5d7d-4e9b-9674-9c24d060a0c5-kube-api-access-jkjjv\") pod \"nova-metadata-0\" (UID: \"b829e543-5d7d-4e9b-9674-9c24d060a0c5\") " pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.175040 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.206323 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.346992 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2" path="/var/lib/kubelet/pods/8b2718fc-f9ec-42fb-9aad-8ff7232c4ee2/volumes"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.348102 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a47cec82-31c1-4842-8a5a-fc233e1235f8" path="/var/lib/kubelet/pods/a47cec82-31c1-4842-8a5a-fc233e1235f8/volumes"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.348830 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e8431e-8789-4bda-a146-4b2c056eb240" path="/var/lib/kubelet/pods/c4e8431e-8789-4bda-a146-4b2c056eb240/volumes"
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.657427 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: W1003 09:02:00.674971 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3e4ca6a_6d02_4c7e_b549_64e5900371ab.slice/crio-2f25446775509fc1dfe3af268e5ee895499f0f957942f499271b997b756de77a WatchSource:0}: Error finding container 2f25446775509fc1dfe3af268e5ee895499f0f957942f499271b997b756de77a: Status 404 returned error can't find the container with id 2f25446775509fc1dfe3af268e5ee895499f0f957942f499271b997b756de77a
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.744651 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.746270 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3e4ca6a-6d02-4c7e-b549-64e5900371ab","Type":"ContainerStarted","Data":"2f25446775509fc1dfe3af268e5ee895499f0f957942f499271b997b756de77a"}
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.749114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"87e3a414-fd4b-49ac-8e64-3514f9725df1","Type":"ContainerStarted","Data":"c43a10a3a4d933f9116558f2dd47d02a30474ba20d8800a241af2ed02926e2ee"}
Oct 03 09:02:00 crc kubenswrapper[4790]: W1003 09:02:00.751902 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb829e543_5d7d_4e9b_9674_9c24d060a0c5.slice/crio-e84bdb0f159801a3fbf298c45e306c2c7c58af3b23c58da9b7744a86626bc77a WatchSource:0}: Error finding container e84bdb0f159801a3fbf298c45e306c2c7c58af3b23c58da9b7744a86626bc77a: Status 404 returned error can't find the container with id e84bdb0f159801a3fbf298c45e306c2c7c58af3b23c58da9b7744a86626bc77a
Oct 03 09:02:00 crc kubenswrapper[4790]: I1003 09:02:00.770471 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.770452382 podStartE2EDuration="2.770452382s" podCreationTimestamp="2025-10-03 09:01:58 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:00.765427795 +0000 UTC m=+1307.118362043" watchObservedRunningTime="2025-10-03 09:02:00.770452382 +0000 UTC m=+1307.123386620" Oct 03 09:02:01 crc kubenswrapper[4790]: I1003 09:02:01.760257 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3e4ca6a-6d02-4c7e-b549-64e5900371ab","Type":"ContainerStarted","Data":"8c776fae9440268842688b546d2dc2ccfeeb1df12bce1ce9063cb42740a475fb"} Oct 03 09:02:01 crc kubenswrapper[4790]: I1003 09:02:01.760554 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3e4ca6a-6d02-4c7e-b549-64e5900371ab","Type":"ContainerStarted","Data":"5d4168c6a34e3898792817b5be66f138f09671abce8389b75bf890ba39699ab0"} Oct 03 09:02:01 crc kubenswrapper[4790]: I1003 09:02:01.763971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b829e543-5d7d-4e9b-9674-9c24d060a0c5","Type":"ContainerStarted","Data":"91a6e0bdd2e6c0da06ace2526623f0c8fc2d01584085bf59fca1bbe26a1f4049"} Oct 03 09:02:01 crc kubenswrapper[4790]: I1003 09:02:01.764009 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b829e543-5d7d-4e9b-9674-9c24d060a0c5","Type":"ContainerStarted","Data":"93b9db7d764adac93a919eace3903e9692278552e970431e9648d0d603f73ce9"} Oct 03 09:02:01 crc kubenswrapper[4790]: I1003 09:02:01.764026 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b829e543-5d7d-4e9b-9674-9c24d060a0c5","Type":"ContainerStarted","Data":"e84bdb0f159801a3fbf298c45e306c2c7c58af3b23c58da9b7744a86626bc77a"} Oct 03 09:02:01 crc kubenswrapper[4790]: I1003 09:02:01.782708 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.782687374 podStartE2EDuration="2.782687374s" 
podCreationTimestamp="2025-10-03 09:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:01.776532426 +0000 UTC m=+1308.129466684" watchObservedRunningTime="2025-10-03 09:02:01.782687374 +0000 UTC m=+1308.135621612" Oct 03 09:02:01 crc kubenswrapper[4790]: I1003 09:02:01.802740 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.802708423 podStartE2EDuration="2.802708423s" podCreationTimestamp="2025-10-03 09:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:01.794351884 +0000 UTC m=+1308.147286152" watchObservedRunningTime="2025-10-03 09:02:01.802708423 +0000 UTC m=+1308.155642671" Oct 03 09:02:04 crc kubenswrapper[4790]: I1003 09:02:04.130426 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 03 09:02:05 crc kubenswrapper[4790]: I1003 09:02:05.206754 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:02:05 crc kubenswrapper[4790]: I1003 09:02:05.207116 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 03 09:02:07 crc kubenswrapper[4790]: I1003 09:02:07.810343 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 03 09:02:08 crc kubenswrapper[4790]: I1003 09:02:08.476660 4790 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podae318a68-abcf-4d99-ac81-222288742659"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podae318a68-abcf-4d99-ac81-222288742659] : Timed out while waiting for systemd to remove kubepods-besteffort-podae318a68_abcf_4d99_ac81_222288742659.slice" Oct 03 09:02:09 crc 
kubenswrapper[4790]: I1003 09:02:09.130372 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 03 09:02:09 crc kubenswrapper[4790]: I1003 09:02:09.159703 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 03 09:02:09 crc kubenswrapper[4790]: I1003 09:02:09.871998 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 03 09:02:10 crc kubenswrapper[4790]: I1003 09:02:10.178653 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:02:10 crc kubenswrapper[4790]: I1003 09:02:10.179567 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 03 09:02:10 crc kubenswrapper[4790]: I1003 09:02:10.186764 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:02:10 crc kubenswrapper[4790]: I1003 09:02:10.186990 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:02:10 crc kubenswrapper[4790]: I1003 09:02:10.206772 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 09:02:10 crc kubenswrapper[4790]: I1003 09:02:10.208071 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 03 09:02:11 crc kubenswrapper[4790]: I1003 09:02:11.222798 4790 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3e4ca6a-6d02-4c7e-b549-64e5900371ab" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:11 crc kubenswrapper[4790]: I1003 09:02:11.223533 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3e4ca6a-6d02-4c7e-b549-64e5900371ab" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.224:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:11 crc kubenswrapper[4790]: I1003 09:02:11.233721 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b829e543-5d7d-4e9b-9674-9c24d060a0c5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:11 crc kubenswrapper[4790]: I1003 09:02:11.233727 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b829e543-5d7d-4e9b-9674-9c24d060a0c5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 03 09:02:20 crc kubenswrapper[4790]: I1003 09:02:20.189278 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 09:02:20 crc kubenswrapper[4790]: I1003 09:02:20.190634 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 09:02:20 crc kubenswrapper[4790]: I1003 09:02:20.196159 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 03 09:02:20 crc kubenswrapper[4790]: I1003 09:02:20.201893 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 09:02:20 crc kubenswrapper[4790]: I1003 09:02:20.217154 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 09:02:20 crc kubenswrapper[4790]: I1003 09:02:20.221173 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 03 09:02:20 crc kubenswrapper[4790]: I1003 09:02:20.223641 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 09:02:20 crc kubenswrapper[4790]: I1003 09:02:20.966979 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 03 09:02:20 crc kubenswrapper[4790]: I1003 09:02:20.971967 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 03 09:02:20 crc kubenswrapper[4790]: I1003 09:02:20.981492 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 03 09:02:29 crc kubenswrapper[4790]: I1003 09:02:29.072686 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 09:02:29 crc kubenswrapper[4790]: I1003 09:02:29.991767 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 09:02:32 crc kubenswrapper[4790]: I1003 09:02:32.534359 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f79d57e8-680a-41b6-bca9-c233a695be4e" containerName="rabbitmq" containerID="cri-o://e71013313068ea9ba19dfe304d8f5cc8e067a087124a03955964c966041a4c30" gracePeriod=604797 Oct 03 09:02:33 crc kubenswrapper[4790]: I1003 09:02:33.082286 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="f5ab6b2b-5bba-4043-b116-b8d996df1ace" containerName="rabbitmq" 
containerID="cri-o://d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca" gracePeriod=604797 Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.122406 4790 generic.go:334] "Generic (PLEG): container finished" podID="f79d57e8-680a-41b6-bca9-c233a695be4e" containerID="e71013313068ea9ba19dfe304d8f5cc8e067a087124a03955964c966041a4c30" exitCode=0 Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.122536 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f79d57e8-680a-41b6-bca9-c233a695be4e","Type":"ContainerDied","Data":"e71013313068ea9ba19dfe304d8f5cc8e067a087124a03955964c966041a4c30"} Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.122699 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f79d57e8-680a-41b6-bca9-c233a695be4e","Type":"ContainerDied","Data":"b222b3d339eff75c384074dc30ffa77e7683d9a7296948d7303c51f3a939e5f7"} Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.122719 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b222b3d339eff75c384074dc30ffa77e7683d9a7296948d7303c51f3a939e5f7" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.230200 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.324259 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.325093 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-confd\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.325163 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-config-data\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.325210 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f79d57e8-680a-41b6-bca9-c233a695be4e-erlang-cookie-secret\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.325260 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89kch\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-kube-api-access-89kch\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.325318 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/f79d57e8-680a-41b6-bca9-c233a695be4e-pod-info\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.325406 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-plugins\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.325451 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-tls\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.325520 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-erlang-cookie\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.325550 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-server-conf\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.325597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-plugins-conf\") pod \"f79d57e8-680a-41b6-bca9-c233a695be4e\" (UID: \"f79d57e8-680a-41b6-bca9-c233a695be4e\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 
09:02:34.329075 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.332823 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.333343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.334261 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f79d57e8-680a-41b6-bca9-c233a695be4e-pod-info" (OuterVolumeSpecName: "pod-info") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.334623 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79d57e8-680a-41b6-bca9-c233a695be4e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.336482 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.340285 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.346001 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-kube-api-access-89kch" (OuterVolumeSpecName: "kube-api-access-89kch") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "kube-api-access-89kch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.393343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-config-data" (OuterVolumeSpecName: "config-data") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.429854 4790 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f79d57e8-680a-41b6-bca9-c233a695be4e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.430035 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89kch\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-kube-api-access-89kch\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.430087 4790 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f79d57e8-680a-41b6-bca9-c233a695be4e-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.430137 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.430182 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.430268 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.430331 4790 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.430414 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.430470 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.475287 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.513675 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-server-conf" (OuterVolumeSpecName: "server-conf") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.533270 4790 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f79d57e8-680a-41b6-bca9-c233a695be4e-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.533313 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.582762 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f79d57e8-680a-41b6-bca9-c233a695be4e" (UID: "f79d57e8-680a-41b6-bca9-c233a695be4e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.638088 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f79d57e8-680a-41b6-bca9-c233a695be4e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.693047 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.854328 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-plugins\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.854876 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5ab6b2b-5bba-4043-b116-b8d996df1ace-erlang-cookie-secret\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.854918 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.854990 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-confd\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.855028 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-server-conf\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.855075 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq7zf\" (UniqueName: 
\"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-kube-api-access-dq7zf\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.855096 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-plugins-conf\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.855112 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-tls\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.855145 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-config-data\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.855180 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5ab6b2b-5bba-4043-b116-b8d996df1ace-pod-info\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.855246 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-erlang-cookie\") pod \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\" (UID: \"f5ab6b2b-5bba-4043-b116-b8d996df1ace\") " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 
09:02:34.858127 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.859077 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.860550 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.861294 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5ab6b2b-5bba-4043-b116-b8d996df1ace-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.863527 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-kube-api-access-dq7zf" (OuterVolumeSpecName: "kube-api-access-dq7zf") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "kube-api-access-dq7zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.864697 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.865492 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.866704 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f5ab6b2b-5bba-4043-b116-b8d996df1ace-pod-info" (OuterVolumeSpecName: "pod-info") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.900634 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-config-data" (OuterVolumeSpecName: "config-data") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.936701 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-server-conf" (OuterVolumeSpecName: "server-conf") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.957407 4790 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-server-conf\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.957453 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq7zf\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-kube-api-access-dq7zf\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.957465 4790 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.957473 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" 
Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.957482 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f5ab6b2b-5bba-4043-b116-b8d996df1ace-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.957489 4790 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f5ab6b2b-5bba-4043-b116-b8d996df1ace-pod-info\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.957505 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.957525 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.957533 4790 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f5ab6b2b-5bba-4043-b116-b8d996df1ace-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.957562 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.982500 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 03 09:02:34 crc kubenswrapper[4790]: I1003 09:02:34.993193 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f5ab6b2b-5bba-4043-b116-b8d996df1ace" (UID: "f5ab6b2b-5bba-4043-b116-b8d996df1ace"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.059088 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.059122 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f5ab6b2b-5bba-4043-b116-b8d996df1ace-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.133652 4790 generic.go:334] "Generic (PLEG): container finished" podID="f5ab6b2b-5bba-4043-b116-b8d996df1ace" containerID="d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca" exitCode=0 Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.133734 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f5ab6b2b-5bba-4043-b116-b8d996df1ace","Type":"ContainerDied","Data":"d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca"} Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.133779 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.133788 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f5ab6b2b-5bba-4043-b116-b8d996df1ace","Type":"ContainerDied","Data":"e4f27495358a00205ee47f385968638e41b2517d471d590ba99cc7a1f150bfd6"} Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.133807 4790 scope.go:117] "RemoveContainer" containerID="d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.134993 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.157812 4790 scope.go:117] "RemoveContainer" containerID="f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.173277 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.185545 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.211933 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.223831 4790 scope.go:117] "RemoveContainer" containerID="d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca" Oct 03 09:02:35 crc kubenswrapper[4790]: E1003 09:02:35.225840 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca\": container with ID starting with d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca not found: ID does not exist" 
containerID="d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.225891 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca"} err="failed to get container status \"d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca\": rpc error: code = NotFound desc = could not find container \"d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca\": container with ID starting with d0b4288964fce1d822cc4ff4bbfc3a8a5741d45a60b21fefab8e2f14f47072ca not found: ID does not exist" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.225921 4790 scope.go:117] "RemoveContainer" containerID="f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.226086 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 09:02:35 crc kubenswrapper[4790]: E1003 09:02:35.226442 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323\": container with ID starting with f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323 not found: ID does not exist" containerID="f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.226472 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323"} err="failed to get container status \"f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323\": rpc error: code = NotFound desc = could not find container \"f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323\": container with ID starting with 
f077e584af63fd1d7d0309f8b1b0933b111eb5a14bd21bd6d74599277edc4323 not found: ID does not exist" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.250972 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 09:02:35 crc kubenswrapper[4790]: E1003 09:02:35.251511 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79d57e8-680a-41b6-bca9-c233a695be4e" containerName="setup-container" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.251536 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79d57e8-680a-41b6-bca9-c233a695be4e" containerName="setup-container" Oct 03 09:02:35 crc kubenswrapper[4790]: E1003 09:02:35.251555 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ab6b2b-5bba-4043-b116-b8d996df1ace" containerName="setup-container" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.251565 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ab6b2b-5bba-4043-b116-b8d996df1ace" containerName="setup-container" Oct 03 09:02:35 crc kubenswrapper[4790]: E1003 09:02:35.251698 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79d57e8-680a-41b6-bca9-c233a695be4e" containerName="rabbitmq" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.251709 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79d57e8-680a-41b6-bca9-c233a695be4e" containerName="rabbitmq" Oct 03 09:02:35 crc kubenswrapper[4790]: E1003 09:02:35.251723 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ab6b2b-5bba-4043-b116-b8d996df1ace" containerName="rabbitmq" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.251730 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ab6b2b-5bba-4043-b116-b8d996df1ace" containerName="rabbitmq" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.252003 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ab6b2b-5bba-4043-b116-b8d996df1ace" 
containerName="rabbitmq" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.252036 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79d57e8-680a-41b6-bca9-c233a695be4e" containerName="rabbitmq" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.253117 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.254967 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.255191 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-q8nqg" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.255729 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.255892 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.256007 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.256153 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.256288 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.280811 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.282534 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.285213 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.285274 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2jckr" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.285437 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.285600 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.288231 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.289264 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.292777 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.296175 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.305589 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365456 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-server-conf\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365519 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365548 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365683 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gcmv\" (UniqueName: \"kubernetes.io/projected/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-kube-api-access-5gcmv\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365791 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-pod-info\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365821 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-config-data\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365845 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365883 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365911 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365937 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.365974 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvrqc\" (UniqueName: \"kubernetes.io/projected/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-kube-api-access-tvrqc\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.366058 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.366093 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.366164 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.366429 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.366485 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.366560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.366619 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.366664 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.366745 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.468630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-server-conf\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.468894 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.468937 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.468991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469027 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " 
pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gcmv\" (UniqueName: \"kubernetes.io/projected/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-kube-api-access-5gcmv\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469172 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-pod-info\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469200 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-config-data\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469217 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469275 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469296 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469347 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvrqc\" (UniqueName: \"kubernetes.io/projected/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-kube-api-access-tvrqc\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469380 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469400 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469502 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469525 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469593 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469665 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469825 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.469832 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-server-conf\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.470110 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.471285 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-config-data\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.471621 4790 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.473517 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.473768 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.474187 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.474316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.474682 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.474780 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.475087 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.475433 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.475462 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.477741 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " 
pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.480137 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-pod-info\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.482320 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.482808 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.489083 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.489095 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.492426 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5gcmv\" (UniqueName: \"kubernetes.io/projected/77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9-kube-api-access-5gcmv\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.495847 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvrqc\" (UniqueName: \"kubernetes.io/projected/babbce79-b0e5-48e6-a5b7-b08e51f1dd69-kube-api-access-tvrqc\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.515360 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9\") " pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.521384 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"babbce79-b0e5-48e6-a5b7-b08e51f1dd69\") " pod="openstack/rabbitmq-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.585224 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:02:35 crc kubenswrapper[4790]: I1003 09:02:35.615883 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 03 09:02:36 crc kubenswrapper[4790]: I1003 09:02:36.104017 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 03 09:02:36 crc kubenswrapper[4790]: I1003 09:02:36.152454 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9","Type":"ContainerStarted","Data":"28b056c4cd9dc774714c47c94094d22b793625c45f870a1dca1f0ea3e59c25a8"} Oct 03 09:02:36 crc kubenswrapper[4790]: W1003 09:02:36.231688 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbabbce79_b0e5_48e6_a5b7_b08e51f1dd69.slice/crio-6078165cbdde1f6475641acdfcf26aac4a803a7f38dc4d41f84daccb785e874a WatchSource:0}: Error finding container 6078165cbdde1f6475641acdfcf26aac4a803a7f38dc4d41f84daccb785e874a: Status 404 returned error can't find the container with id 6078165cbdde1f6475641acdfcf26aac4a803a7f38dc4d41f84daccb785e874a Oct 03 09:02:36 crc kubenswrapper[4790]: I1003 09:02:36.232118 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 03 09:02:36 crc kubenswrapper[4790]: I1003 09:02:36.341487 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ab6b2b-5bba-4043-b116-b8d996df1ace" path="/var/lib/kubelet/pods/f5ab6b2b-5bba-4043-b116-b8d996df1ace/volumes" Oct 03 09:02:36 crc kubenswrapper[4790]: I1003 09:02:36.343722 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79d57e8-680a-41b6-bca9-c233a695be4e" path="/var/lib/kubelet/pods/f79d57e8-680a-41b6-bca9-c233a695be4e/volumes" Oct 03 09:02:37 crc kubenswrapper[4790]: I1003 09:02:37.165385 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"babbce79-b0e5-48e6-a5b7-b08e51f1dd69","Type":"ContainerStarted","Data":"6078165cbdde1f6475641acdfcf26aac4a803a7f38dc4d41f84daccb785e874a"} Oct 03 09:02:38 crc kubenswrapper[4790]: I1003 09:02:38.175645 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9","Type":"ContainerStarted","Data":"91d25227a28f966934718bf6fc4f47604526efe1139080419c21b9e9c93a143b"} Oct 03 09:02:38 crc kubenswrapper[4790]: I1003 09:02:38.177330 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"babbce79-b0e5-48e6-a5b7-b08e51f1dd69","Type":"ContainerStarted","Data":"c6f8ee29ac61a2f3d5c1d9292dc67890e29eb916af918d939318631f69b1a836"} Oct 03 09:02:40 crc kubenswrapper[4790]: I1003 09:02:40.186884 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:02:40 crc kubenswrapper[4790]: I1003 09:02:40.186942 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:02:40 crc kubenswrapper[4790]: I1003 09:02:40.186988 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 09:02:40 crc kubenswrapper[4790]: I1003 09:02:40.187721 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2fcffece27550dbe144383b7c492d28b1ee8a149f741f79ecb706a01a9252b8"} 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:02:40 crc kubenswrapper[4790]: I1003 09:02:40.187800 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://c2fcffece27550dbe144383b7c492d28b1ee8a149f741f79ecb706a01a9252b8" gracePeriod=600 Oct 03 09:02:41 crc kubenswrapper[4790]: I1003 09:02:41.239780 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="c2fcffece27550dbe144383b7c492d28b1ee8a149f741f79ecb706a01a9252b8" exitCode=0 Oct 03 09:02:41 crc kubenswrapper[4790]: I1003 09:02:41.239858 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"c2fcffece27550dbe144383b7c492d28b1ee8a149f741f79ecb706a01a9252b8"} Oct 03 09:02:41 crc kubenswrapper[4790]: I1003 09:02:41.240454 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700"} Oct 03 09:02:41 crc kubenswrapper[4790]: I1003 09:02:41.240474 4790 scope.go:117] "RemoveContainer" containerID="ab4e236d11b4b58cf11bf821ee4fe0466ca737afca708932f0cbf0c7c147ea38" Oct 03 09:02:43 crc kubenswrapper[4790]: I1003 09:02:43.963629 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c897994dc-jgs4b"] Oct 03 09:02:43 crc kubenswrapper[4790]: I1003 09:02:43.967325 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:43 crc kubenswrapper[4790]: I1003 09:02:43.969199 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 03 09:02:43 crc kubenswrapper[4790]: I1003 09:02:43.979801 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c897994dc-jgs4b"] Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.048874 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r5cf\" (UniqueName: \"kubernetes.io/projected/28968356-fef7-4f43-8815-67e1b57ef38e-kube-api-access-2r5cf\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.049152 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-swift-storage-0\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.049243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-svc\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.049365 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: 
\"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.049713 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-nb\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.049864 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-config\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.049952 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-sb\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.126809 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c897994dc-jgs4b"] Oct 03 09:02:44 crc kubenswrapper[4790]: E1003 09:02:44.127963 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-2r5cf openstack-edpm-ipam ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" podUID="28968356-fef7-4f43-8815-67e1b57ef38e" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.152229 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-config\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.152289 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-sb\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.152322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r5cf\" (UniqueName: \"kubernetes.io/projected/28968356-fef7-4f43-8815-67e1b57ef38e-kube-api-access-2r5cf\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.152377 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-swift-storage-0\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.152399 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-svc\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.152454 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.152480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-nb\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.153351 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-nb\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.153897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-config\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.154385 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-sb\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.161347 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-svc\") pod 
\"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.161396 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-swift-storage-0\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.163908 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d9785cb67-n2h98"] Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.166091 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.169240 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-openstack-edpm-ipam\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.174858 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d9785cb67-n2h98"] Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.179068 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r5cf\" (UniqueName: \"kubernetes.io/projected/28968356-fef7-4f43-8815-67e1b57ef38e-kube-api-access-2r5cf\") pod \"dnsmasq-dns-7c897994dc-jgs4b\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.255900 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-config\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.255973 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmbvr\" (UniqueName: \"kubernetes.io/projected/e5a9c970-49d0-488a-9cdd-dc438ce51e75-kube-api-access-hmbvr\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.256033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.256097 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.256163 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-svc\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.256236 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-swift-storage-0\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.256267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-sb\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.280910 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.294056 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357053 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-swift-storage-0\") pod \"28968356-fef7-4f43-8815-67e1b57ef38e\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357153 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-nb\") pod \"28968356-fef7-4f43-8815-67e1b57ef38e\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357298 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-svc\") pod \"28968356-fef7-4f43-8815-67e1b57ef38e\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357369 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-sb\") pod \"28968356-fef7-4f43-8815-67e1b57ef38e\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357416 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-config\") pod \"28968356-fef7-4f43-8815-67e1b57ef38e\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357466 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-openstack-edpm-ipam\") pod \"28968356-fef7-4f43-8815-67e1b57ef38e\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357508 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r5cf\" (UniqueName: \"kubernetes.io/projected/28968356-fef7-4f43-8815-67e1b57ef38e-kube-api-access-2r5cf\") pod \"28968356-fef7-4f43-8815-67e1b57ef38e\" (UID: \"28968356-fef7-4f43-8815-67e1b57ef38e\") " Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357679 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "28968356-fef7-4f43-8815-67e1b57ef38e" (UID: "28968356-fef7-4f43-8815-67e1b57ef38e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357732 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-swift-storage-0\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357755 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-sb\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357777 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "28968356-fef7-4f43-8815-67e1b57ef38e" (UID: "28968356-fef7-4f43-8815-67e1b57ef38e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357798 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-config\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357844 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-config" (OuterVolumeSpecName: "config") pod "28968356-fef7-4f43-8815-67e1b57ef38e" (UID: "28968356-fef7-4f43-8815-67e1b57ef38e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357877 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmbvr\" (UniqueName: \"kubernetes.io/projected/e5a9c970-49d0-488a-9cdd-dc438ce51e75-kube-api-access-hmbvr\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357976 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.357975 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "28968356-fef7-4f43-8815-67e1b57ef38e" (UID: "28968356-fef7-4f43-8815-67e1b57ef38e"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "28968356-fef7-4f43-8815-67e1b57ef38e" (UID: "28968356-fef7-4f43-8815-67e1b57ef38e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358092 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358165 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-svc\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358249 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358264 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358274 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358282 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358292 4790 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358478 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-config\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358521 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "28968356-fef7-4f43-8815-67e1b57ef38e" (UID: "28968356-fef7-4f43-8815-67e1b57ef38e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358682 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-openstack-edpm-ipam\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.358860 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.359338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.360011 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-swift-storage-0\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.360495 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-svc\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.377431 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28968356-fef7-4f43-8815-67e1b57ef38e-kube-api-access-2r5cf" (OuterVolumeSpecName: "kube-api-access-2r5cf") pod "28968356-fef7-4f43-8815-67e1b57ef38e" (UID: "28968356-fef7-4f43-8815-67e1b57ef38e"). InnerVolumeSpecName "kube-api-access-2r5cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.383367 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmbvr\" (UniqueName: \"kubernetes.io/projected/e5a9c970-49d0-488a-9cdd-dc438ce51e75-kube-api-access-hmbvr\") pod \"dnsmasq-dns-5d9785cb67-n2h98\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.460826 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r5cf\" (UniqueName: \"kubernetes.io/projected/28968356-fef7-4f43-8815-67e1b57ef38e-kube-api-access-2r5cf\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.460854 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28968356-fef7-4f43-8815-67e1b57ef38e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:44 crc kubenswrapper[4790]: I1003 09:02:44.528917 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:45 crc kubenswrapper[4790]: I1003 09:02:45.007407 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d9785cb67-n2h98"] Oct 03 09:02:45 crc kubenswrapper[4790]: I1003 09:02:45.345491 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c897994dc-jgs4b" Oct 03 09:02:45 crc kubenswrapper[4790]: I1003 09:02:45.347360 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" event={"ID":"e5a9c970-49d0-488a-9cdd-dc438ce51e75","Type":"ContainerStarted","Data":"dc7ef96452daffe71145680f36587783bb53ddd8db3266b175347b098e5a7555"} Oct 03 09:02:45 crc kubenswrapper[4790]: I1003 09:02:45.422635 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c897994dc-jgs4b"] Oct 03 09:02:45 crc kubenswrapper[4790]: I1003 09:02:45.438614 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c897994dc-jgs4b"] Oct 03 09:02:46 crc kubenswrapper[4790]: I1003 09:02:46.341529 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28968356-fef7-4f43-8815-67e1b57ef38e" path="/var/lib/kubelet/pods/28968356-fef7-4f43-8815-67e1b57ef38e/volumes" Oct 03 09:02:46 crc kubenswrapper[4790]: I1003 09:02:46.370595 4790 generic.go:334] "Generic (PLEG): container finished" podID="e5a9c970-49d0-488a-9cdd-dc438ce51e75" containerID="c42cf6774b329d9e299bf95be33cb81adb2f492f060e3bf075a98a737d7925b7" exitCode=0 Oct 03 09:02:46 crc kubenswrapper[4790]: I1003 09:02:46.370644 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" event={"ID":"e5a9c970-49d0-488a-9cdd-dc438ce51e75","Type":"ContainerDied","Data":"c42cf6774b329d9e299bf95be33cb81adb2f492f060e3bf075a98a737d7925b7"} Oct 03 09:02:47 crc kubenswrapper[4790]: I1003 09:02:47.383714 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" event={"ID":"e5a9c970-49d0-488a-9cdd-dc438ce51e75","Type":"ContainerStarted","Data":"d571d1c62a217999c46f28388f7dbe3587d18e614b1c2db9f322748e77336310"} Oct 03 09:02:47 crc kubenswrapper[4790]: I1003 09:02:47.384205 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:47 crc kubenswrapper[4790]: I1003 09:02:47.402526 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" podStartSLOduration=3.402502047 podStartE2EDuration="3.402502047s" podCreationTimestamp="2025-10-03 09:02:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:47.40003414 +0000 UTC m=+1353.752968388" watchObservedRunningTime="2025-10-03 09:02:47.402502047 +0000 UTC m=+1353.755436285" Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.530666 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.629167 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59fd7cc775-bxj2s"] Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.629745 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" podUID="5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" containerName="dnsmasq-dns" containerID="cri-o://4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede" gracePeriod=10 Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.791705 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79ff9d4d8f-zfwpx"] Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.794394 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.835755 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79ff9d4d8f-zfwpx"] Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.986068 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbz97\" (UniqueName: \"kubernetes.io/projected/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-kube-api-access-dbz97\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.986299 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-openstack-edpm-ipam\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.986348 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-config\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.986417 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-ovsdbserver-nb\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.986482 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-dns-swift-storage-0\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.986512 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-dns-svc\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:54 crc kubenswrapper[4790]: I1003 09:02:54.986549 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-ovsdbserver-sb\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.088144 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-openstack-edpm-ipam\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.088182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbz97\" (UniqueName: \"kubernetes.io/projected/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-kube-api-access-dbz97\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.088228 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-config\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.088278 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-ovsdbserver-nb\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.088317 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-dns-swift-storage-0\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.088347 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-dns-svc\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.088390 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-ovsdbserver-sb\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.090436 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-config\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.091181 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-openstack-edpm-ipam\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.092284 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-dns-swift-storage-0\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.093229 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-dns-svc\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.095143 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-ovsdbserver-nb\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.095202 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-ovsdbserver-sb\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.112975 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbz97\" (UniqueName: \"kubernetes.io/projected/33a1dfa6-9323-4d22-b0c1-0bd210b9638f-kube-api-access-dbz97\") pod \"dnsmasq-dns-79ff9d4d8f-zfwpx\" (UID: \"33a1dfa6-9323-4d22-b0c1-0bd210b9638f\") " pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.138297 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.280765 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.397342 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-svc\") pod \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.397792 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-config\") pod \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.397861 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-sb\") pod \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\" (UID: 
\"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.397900 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-swift-storage-0\") pod \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.397939 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5lmm\" (UniqueName: \"kubernetes.io/projected/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-kube-api-access-q5lmm\") pod \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.398023 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-nb\") pod \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\" (UID: \"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f\") " Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.401619 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-kube-api-access-q5lmm" (OuterVolumeSpecName: "kube-api-access-q5lmm") pod "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" (UID: "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f"). InnerVolumeSpecName "kube-api-access-q5lmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.452442 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" (UID: "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.476273 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-config" (OuterVolumeSpecName: "config") pod "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" (UID: "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.477901 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" (UID: "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.481467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" (UID: "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.486892 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" (UID: "5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.509095 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.509144 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.509161 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.509175 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5lmm\" (UniqueName: \"kubernetes.io/projected/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-kube-api-access-q5lmm\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.509195 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.509207 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.511805 4790 generic.go:334] "Generic (PLEG): container finished" podID="5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" containerID="4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede" exitCode=0 Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.511865 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" event={"ID":"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f","Type":"ContainerDied","Data":"4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede"} Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.511895 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" event={"ID":"5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f","Type":"ContainerDied","Data":"4ba559de7664fd811939d1cb53e0f18d25d52611fc9b1284c0cb38b12647247f"} Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.511913 4790 scope.go:117] "RemoveContainer" containerID="4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.512115 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59fd7cc775-bxj2s" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.555290 4790 scope.go:117] "RemoveContainer" containerID="e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.571843 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59fd7cc775-bxj2s"] Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.588057 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59fd7cc775-bxj2s"] Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.596974 4790 scope.go:117] "RemoveContainer" containerID="4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede" Oct 03 09:02:55 crc kubenswrapper[4790]: E1003 09:02:55.597417 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede\": container with ID starting with 4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede not found: ID does not exist" 
containerID="4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.597456 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede"} err="failed to get container status \"4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede\": rpc error: code = NotFound desc = could not find container \"4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede\": container with ID starting with 4f2be66f7c868359503eafabcb37a05cd64a4504df08bbf7b790b398bc540ede not found: ID does not exist" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.597484 4790 scope.go:117] "RemoveContainer" containerID="e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc" Oct 03 09:02:55 crc kubenswrapper[4790]: E1003 09:02:55.597898 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc\": container with ID starting with e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc not found: ID does not exist" containerID="e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.597930 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc"} err="failed to get container status \"e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc\": rpc error: code = NotFound desc = could not find container \"e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc\": container with ID starting with e2d0cd7daa387af207d23fb18bc8e2652cc63773a9e3cb1169781a30d1ccb0cc not found: ID does not exist" Oct 03 09:02:55 crc kubenswrapper[4790]: I1003 09:02:55.658049 4790 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79ff9d4d8f-zfwpx"] Oct 03 09:02:55 crc kubenswrapper[4790]: W1003 09:02:55.660666 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33a1dfa6_9323_4d22_b0c1_0bd210b9638f.slice/crio-41a5447e85575fc0fa277192948ad7a06a42b850e1e6876532d2060feaf8cc29 WatchSource:0}: Error finding container 41a5447e85575fc0fa277192948ad7a06a42b850e1e6876532d2060feaf8cc29: Status 404 returned error can't find the container with id 41a5447e85575fc0fa277192948ad7a06a42b850e1e6876532d2060feaf8cc29 Oct 03 09:02:56 crc kubenswrapper[4790]: I1003 09:02:56.340941 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" path="/var/lib/kubelet/pods/5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f/volumes" Oct 03 09:02:56 crc kubenswrapper[4790]: I1003 09:02:56.524036 4790 generic.go:334] "Generic (PLEG): container finished" podID="33a1dfa6-9323-4d22-b0c1-0bd210b9638f" containerID="64d5e56c0b03918a09513c186b6cf6cbf41aa8e0b2f7c6c95546eead3ead825f" exitCode=0 Oct 03 09:02:56 crc kubenswrapper[4790]: I1003 09:02:56.524279 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" event={"ID":"33a1dfa6-9323-4d22-b0c1-0bd210b9638f","Type":"ContainerDied","Data":"64d5e56c0b03918a09513c186b6cf6cbf41aa8e0b2f7c6c95546eead3ead825f"} Oct 03 09:02:56 crc kubenswrapper[4790]: I1003 09:02:56.524316 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" event={"ID":"33a1dfa6-9323-4d22-b0c1-0bd210b9638f","Type":"ContainerStarted","Data":"41a5447e85575fc0fa277192948ad7a06a42b850e1e6876532d2060feaf8cc29"} Oct 03 09:02:57 crc kubenswrapper[4790]: I1003 09:02:57.535445 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" 
event={"ID":"33a1dfa6-9323-4d22-b0c1-0bd210b9638f","Type":"ContainerStarted","Data":"9cc994a57b755673b62af4f6d723b2166e4f85213247bd25bc425b4adab991af"} Oct 03 09:02:57 crc kubenswrapper[4790]: I1003 09:02:57.536131 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:02:57 crc kubenswrapper[4790]: I1003 09:02:57.558169 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" podStartSLOduration=3.558150942 podStartE2EDuration="3.558150942s" podCreationTimestamp="2025-10-03 09:02:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:02:57.554014518 +0000 UTC m=+1363.906948756" watchObservedRunningTime="2025-10-03 09:02:57.558150942 +0000 UTC m=+1363.911085180" Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.139722 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79ff9d4d8f-zfwpx" Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.257289 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9785cb67-n2h98"] Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.257621 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" podUID="e5a9c970-49d0-488a-9cdd-dc438ce51e75" containerName="dnsmasq-dns" containerID="cri-o://d571d1c62a217999c46f28388f7dbe3587d18e614b1c2db9f322748e77336310" gracePeriod=10 Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.616839 4790 generic.go:334] "Generic (PLEG): container finished" podID="e5a9c970-49d0-488a-9cdd-dc438ce51e75" containerID="d571d1c62a217999c46f28388f7dbe3587d18e614b1c2db9f322748e77336310" exitCode=0 Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.617254 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" event={"ID":"e5a9c970-49d0-488a-9cdd-dc438ce51e75","Type":"ContainerDied","Data":"d571d1c62a217999c46f28388f7dbe3587d18e614b1c2db9f322748e77336310"} Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.857662 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.913388 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-svc\") pod \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.913424 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-sb\") pod \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.913445 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-config\") pod \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.913472 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmbvr\" (UniqueName: \"kubernetes.io/projected/e5a9c970-49d0-488a-9cdd-dc438ce51e75-kube-api-access-hmbvr\") pod \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.913590 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-nb\") pod \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.913620 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-swift-storage-0\") pod \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.913647 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-openstack-edpm-ipam\") pod \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\" (UID: \"e5a9c970-49d0-488a-9cdd-dc438ce51e75\") " Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.940047 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a9c970-49d0-488a-9cdd-dc438ce51e75-kube-api-access-hmbvr" (OuterVolumeSpecName: "kube-api-access-hmbvr") pod "e5a9c970-49d0-488a-9cdd-dc438ce51e75" (UID: "e5a9c970-49d0-488a-9cdd-dc438ce51e75"). InnerVolumeSpecName "kube-api-access-hmbvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.977047 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5a9c970-49d0-488a-9cdd-dc438ce51e75" (UID: "e5a9c970-49d0-488a-9cdd-dc438ce51e75"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:05 crc kubenswrapper[4790]: I1003 09:03:05.981525 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5a9c970-49d0-488a-9cdd-dc438ce51e75" (UID: "e5a9c970-49d0-488a-9cdd-dc438ce51e75"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.000501 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-config" (OuterVolumeSpecName: "config") pod "e5a9c970-49d0-488a-9cdd-dc438ce51e75" (UID: "e5a9c970-49d0-488a-9cdd-dc438ce51e75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.003405 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5a9c970-49d0-488a-9cdd-dc438ce51e75" (UID: "e5a9c970-49d0-488a-9cdd-dc438ce51e75"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.009989 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e5a9c970-49d0-488a-9cdd-dc438ce51e75" (UID: "e5a9c970-49d0-488a-9cdd-dc438ce51e75"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.016418 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.016525 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.016845 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.016920 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.016993 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.017064 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmbvr\" (UniqueName: \"kubernetes.io/projected/e5a9c970-49d0-488a-9cdd-dc438ce51e75-kube-api-access-hmbvr\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.019882 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "e5a9c970-49d0-488a-9cdd-dc438ce51e75" (UID: 
"e5a9c970-49d0-488a-9cdd-dc438ce51e75"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.118126 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e5a9c970-49d0-488a-9cdd-dc438ce51e75-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.627289 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" event={"ID":"e5a9c970-49d0-488a-9cdd-dc438ce51e75","Type":"ContainerDied","Data":"dc7ef96452daffe71145680f36587783bb53ddd8db3266b175347b098e5a7555"} Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.627354 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9785cb67-n2h98" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.627686 4790 scope.go:117] "RemoveContainer" containerID="d571d1c62a217999c46f28388f7dbe3587d18e614b1c2db9f322748e77336310" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.655936 4790 scope.go:117] "RemoveContainer" containerID="c42cf6774b329d9e299bf95be33cb81adb2f492f060e3bf075a98a737d7925b7" Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.660792 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9785cb67-n2h98"] Oct 03 09:03:06 crc kubenswrapper[4790]: I1003 09:03:06.670686 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d9785cb67-n2h98"] Oct 03 09:03:08 crc kubenswrapper[4790]: I1003 09:03:08.348491 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a9c970-49d0-488a-9cdd-dc438ce51e75" path="/var/lib/kubelet/pods/e5a9c970-49d0-488a-9cdd-dc438ce51e75/volumes" Oct 03 09:03:10 crc kubenswrapper[4790]: I1003 09:03:10.665219 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9" containerID="91d25227a28f966934718bf6fc4f47604526efe1139080419c21b9e9c93a143b" exitCode=0 Oct 03 09:03:10 crc kubenswrapper[4790]: I1003 09:03:10.665355 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9","Type":"ContainerDied","Data":"91d25227a28f966934718bf6fc4f47604526efe1139080419c21b9e9c93a143b"} Oct 03 09:03:10 crc kubenswrapper[4790]: I1003 09:03:10.668859 4790 generic.go:334] "Generic (PLEG): container finished" podID="babbce79-b0e5-48e6-a5b7-b08e51f1dd69" containerID="c6f8ee29ac61a2f3d5c1d9292dc67890e29eb916af918d939318631f69b1a836" exitCode=0 Oct 03 09:03:10 crc kubenswrapper[4790]: I1003 09:03:10.668942 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"babbce79-b0e5-48e6-a5b7-b08e51f1dd69","Type":"ContainerDied","Data":"c6f8ee29ac61a2f3d5c1d9292dc67890e29eb916af918d939318631f69b1a836"} Oct 03 09:03:11 crc kubenswrapper[4790]: I1003 09:03:11.681467 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"babbce79-b0e5-48e6-a5b7-b08e51f1dd69","Type":"ContainerStarted","Data":"6ba8038f5ea58627ab6fef0d78765c6874f849655b33e04e7adac9d04dfb8573"} Oct 03 09:03:11 crc kubenswrapper[4790]: I1003 09:03:11.682412 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 03 09:03:11 crc kubenswrapper[4790]: I1003 09:03:11.684486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9","Type":"ContainerStarted","Data":"f1f62918be308855a04772640cca2b8005853aa8c22fce74d2e80f3bb435120e"} Oct 03 09:03:11 crc kubenswrapper[4790]: I1003 09:03:11.684903 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:03:11 crc kubenswrapper[4790]: I1003 
09:03:11.713125 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.713106245 podStartE2EDuration="36.713106245s" podCreationTimestamp="2025-10-03 09:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:11.705716962 +0000 UTC m=+1378.058651220" watchObservedRunningTime="2025-10-03 09:03:11.713106245 +0000 UTC m=+1378.066040483" Oct 03 09:03:11 crc kubenswrapper[4790]: I1003 09:03:11.732734 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.732714461 podStartE2EDuration="36.732714461s" podCreationTimestamp="2025-10-03 09:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:03:11.726708366 +0000 UTC m=+1378.079642604" watchObservedRunningTime="2025-10-03 09:03:11.732714461 +0000 UTC m=+1378.085648699" Oct 03 09:03:19 crc kubenswrapper[4790]: I1003 09:03:19.884365 4790 scope.go:117] "RemoveContainer" containerID="4227c40deecaa8a969cfdfaa0a9738eb174bad9141bd35d8295af11e64a3c6d2" Oct 03 09:03:19 crc kubenswrapper[4790]: I1003 09:03:19.911636 4790 scope.go:117] "RemoveContainer" containerID="54852890d6da353794bab5dfcfcf9a00616e67989097493ae01945c983265da4" Oct 03 09:03:19 crc kubenswrapper[4790]: I1003 09:03:19.976503 4790 scope.go:117] "RemoveContainer" containerID="967ede7f4d897538e40daed769bfd55730474c6700fa08ab1a0dcd1960359c86" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.784470 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9"] Oct 03 09:03:23 crc kubenswrapper[4790]: E1003 09:03:23.785856 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a9c970-49d0-488a-9cdd-dc438ce51e75" containerName="init" 
Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.785873 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a9c970-49d0-488a-9cdd-dc438ce51e75" containerName="init" Oct 03 09:03:23 crc kubenswrapper[4790]: E1003 09:03:23.785884 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a9c970-49d0-488a-9cdd-dc438ce51e75" containerName="dnsmasq-dns" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.785890 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a9c970-49d0-488a-9cdd-dc438ce51e75" containerName="dnsmasq-dns" Oct 03 09:03:23 crc kubenswrapper[4790]: E1003 09:03:23.785907 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" containerName="dnsmasq-dns" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.785914 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" containerName="dnsmasq-dns" Oct 03 09:03:23 crc kubenswrapper[4790]: E1003 09:03:23.785941 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" containerName="init" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.785947 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" containerName="init" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.786176 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b00ed4e-aa3d-4cb2-beaf-65ea32d33f4f" containerName="dnsmasq-dns" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.786206 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a9c970-49d0-488a-9cdd-dc438ce51e75" containerName="dnsmasq-dns" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.787058 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.797459 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.804101 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.804371 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.804768 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.822505 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9"] Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.872375 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.872476 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.872541 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.872595 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc9zw\" (UniqueName: \"kubernetes.io/projected/551d5e7d-f7da-45c1-acbe-efa124afdd28-kube-api-access-xc9zw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.975056 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.975143 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.975178 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc9zw\" (UniqueName: \"kubernetes.io/projected/551d5e7d-f7da-45c1-acbe-efa124afdd28-kube-api-access-xc9zw\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.975274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.980738 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.981019 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:23 crc kubenswrapper[4790]: I1003 09:03:23.986034 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:24 crc kubenswrapper[4790]: I1003 09:03:24.002159 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xc9zw\" (UniqueName: \"kubernetes.io/projected/551d5e7d-f7da-45c1-acbe-efa124afdd28-kube-api-access-xc9zw\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:24 crc kubenswrapper[4790]: I1003 09:03:24.123301 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:24 crc kubenswrapper[4790]: I1003 09:03:24.765152 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9"] Oct 03 09:03:24 crc kubenswrapper[4790]: W1003 09:03:24.772603 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod551d5e7d_f7da_45c1_acbe_efa124afdd28.slice/crio-d27dced44762ffe72a430acf4a2e14e96a8a0b6216a1a101c30f4c3b2b406b6a WatchSource:0}: Error finding container d27dced44762ffe72a430acf4a2e14e96a8a0b6216a1a101c30f4c3b2b406b6a: Status 404 returned error can't find the container with id d27dced44762ffe72a430acf4a2e14e96a8a0b6216a1a101c30f4c3b2b406b6a Oct 03 09:03:24 crc kubenswrapper[4790]: I1003 09:03:24.822983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" event={"ID":"551d5e7d-f7da-45c1-acbe-efa124afdd28","Type":"ContainerStarted","Data":"d27dced44762ffe72a430acf4a2e14e96a8a0b6216a1a101c30f4c3b2b406b6a"} Oct 03 09:03:25 crc kubenswrapper[4790]: I1003 09:03:25.588455 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 03 09:03:25 crc kubenswrapper[4790]: I1003 09:03:25.619744 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 03 09:03:34 crc kubenswrapper[4790]: I1003 
09:03:34.914237 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" event={"ID":"551d5e7d-f7da-45c1-acbe-efa124afdd28","Type":"ContainerStarted","Data":"d4714761fc10a9453787528376aff488901b61e17e215487faeee5cfdbd4a556"} Oct 03 09:03:34 crc kubenswrapper[4790]: I1003 09:03:34.939065 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" podStartSLOduration=2.58236904 podStartE2EDuration="11.939043037s" podCreationTimestamp="2025-10-03 09:03:23 +0000 UTC" firstStartedPulling="2025-10-03 09:03:24.774980575 +0000 UTC m=+1391.127914813" lastFinishedPulling="2025-10-03 09:03:34.131654572 +0000 UTC m=+1400.484588810" observedRunningTime="2025-10-03 09:03:34.932999152 +0000 UTC m=+1401.285933380" watchObservedRunningTime="2025-10-03 09:03:34.939043037 +0000 UTC m=+1401.291977275" Oct 03 09:03:46 crc kubenswrapper[4790]: I1003 09:03:46.031108 4790 generic.go:334] "Generic (PLEG): container finished" podID="551d5e7d-f7da-45c1-acbe-efa124afdd28" containerID="d4714761fc10a9453787528376aff488901b61e17e215487faeee5cfdbd4a556" exitCode=0 Oct 03 09:03:46 crc kubenswrapper[4790]: I1003 09:03:46.031217 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" event={"ID":"551d5e7d-f7da-45c1-acbe-efa124afdd28","Type":"ContainerDied","Data":"d4714761fc10a9453787528376aff488901b61e17e215487faeee5cfdbd4a556"} Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.453944 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.483830 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-repo-setup-combined-ca-bundle\") pod \"551d5e7d-f7da-45c1-acbe-efa124afdd28\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.483946 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc9zw\" (UniqueName: \"kubernetes.io/projected/551d5e7d-f7da-45c1-acbe-efa124afdd28-kube-api-access-xc9zw\") pod \"551d5e7d-f7da-45c1-acbe-efa124afdd28\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.483990 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-ssh-key\") pod \"551d5e7d-f7da-45c1-acbe-efa124afdd28\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.484063 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-inventory\") pod \"551d5e7d-f7da-45c1-acbe-efa124afdd28\" (UID: \"551d5e7d-f7da-45c1-acbe-efa124afdd28\") " Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.491150 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "551d5e7d-f7da-45c1-acbe-efa124afdd28" (UID: "551d5e7d-f7da-45c1-acbe-efa124afdd28"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.491326 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551d5e7d-f7da-45c1-acbe-efa124afdd28-kube-api-access-xc9zw" (OuterVolumeSpecName: "kube-api-access-xc9zw") pod "551d5e7d-f7da-45c1-acbe-efa124afdd28" (UID: "551d5e7d-f7da-45c1-acbe-efa124afdd28"). InnerVolumeSpecName "kube-api-access-xc9zw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.518149 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "551d5e7d-f7da-45c1-acbe-efa124afdd28" (UID: "551d5e7d-f7da-45c1-acbe-efa124afdd28"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.520049 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-inventory" (OuterVolumeSpecName: "inventory") pod "551d5e7d-f7da-45c1-acbe-efa124afdd28" (UID: "551d5e7d-f7da-45c1-acbe-efa124afdd28"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.587189 4790 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.587226 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc9zw\" (UniqueName: \"kubernetes.io/projected/551d5e7d-f7da-45c1-acbe-efa124afdd28-kube-api-access-xc9zw\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.587236 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:47 crc kubenswrapper[4790]: I1003 09:03:47.587246 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/551d5e7d-f7da-45c1-acbe-efa124afdd28-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.053822 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" event={"ID":"551d5e7d-f7da-45c1-acbe-efa124afdd28","Type":"ContainerDied","Data":"d27dced44762ffe72a430acf4a2e14e96a8a0b6216a1a101c30f4c3b2b406b6a"} Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.054136 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d27dced44762ffe72a430acf4a2e14e96a8a0b6216a1a101c30f4c3b2b406b6a" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.053907 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.139223 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn"] Oct 03 09:03:48 crc kubenswrapper[4790]: E1003 09:03:48.140924 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551d5e7d-f7da-45c1-acbe-efa124afdd28" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.140948 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="551d5e7d-f7da-45c1-acbe-efa124afdd28" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.141191 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="551d5e7d-f7da-45c1-acbe-efa124afdd28" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.141935 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.145663 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.145832 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.146786 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.146968 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.149355 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn"] Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.200418 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c52nn\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.200466 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kfng\" (UniqueName: \"kubernetes.io/projected/06011acd-ae43-466d-b25d-70eddbc5c34f-kube-api-access-4kfng\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c52nn\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.200510 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c52nn\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.301532 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c52nn\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.301606 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kfng\" (UniqueName: \"kubernetes.io/projected/06011acd-ae43-466d-b25d-70eddbc5c34f-kube-api-access-4kfng\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c52nn\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.301628 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c52nn\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.307890 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c52nn\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.307889 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c52nn\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.323254 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kfng\" (UniqueName: \"kubernetes.io/projected/06011acd-ae43-466d-b25d-70eddbc5c34f-kube-api-access-4kfng\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c52nn\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:48 crc kubenswrapper[4790]: I1003 09:03:48.494729 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:49 crc kubenswrapper[4790]: I1003 09:03:49.016215 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn"] Oct 03 09:03:49 crc kubenswrapper[4790]: W1003 09:03:49.018048 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06011acd_ae43_466d_b25d_70eddbc5c34f.slice/crio-2d5b745e0015e016d2bcef67477aebe6fafbef58d69b01a61d3518db61e0b7c8 WatchSource:0}: Error finding container 2d5b745e0015e016d2bcef67477aebe6fafbef58d69b01a61d3518db61e0b7c8: Status 404 returned error can't find the container with id 2d5b745e0015e016d2bcef67477aebe6fafbef58d69b01a61d3518db61e0b7c8 Oct 03 09:03:49 crc kubenswrapper[4790]: I1003 09:03:49.065833 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" event={"ID":"06011acd-ae43-466d-b25d-70eddbc5c34f","Type":"ContainerStarted","Data":"2d5b745e0015e016d2bcef67477aebe6fafbef58d69b01a61d3518db61e0b7c8"} Oct 03 09:03:50 crc kubenswrapper[4790]: I1003 09:03:50.078892 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" event={"ID":"06011acd-ae43-466d-b25d-70eddbc5c34f","Type":"ContainerStarted","Data":"2fd1f46a035f6bf66cf49a48b3f9f359ee6f8397884d7d40a2fe447c96e478eb"} Oct 03 09:03:50 crc kubenswrapper[4790]: I1003 09:03:50.107856 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" podStartSLOduration=1.729318938 podStartE2EDuration="2.107829622s" podCreationTimestamp="2025-10-03 09:03:48 +0000 UTC" firstStartedPulling="2025-10-03 09:03:49.021153077 +0000 UTC m=+1415.374087315" lastFinishedPulling="2025-10-03 09:03:49.399663761 +0000 UTC m=+1415.752597999" observedRunningTime="2025-10-03 
09:03:50.09565471 +0000 UTC m=+1416.448588968" watchObservedRunningTime="2025-10-03 09:03:50.107829622 +0000 UTC m=+1416.460763870" Oct 03 09:03:52 crc kubenswrapper[4790]: I1003 09:03:52.110949 4790 generic.go:334] "Generic (PLEG): container finished" podID="06011acd-ae43-466d-b25d-70eddbc5c34f" containerID="2fd1f46a035f6bf66cf49a48b3f9f359ee6f8397884d7d40a2fe447c96e478eb" exitCode=0 Oct 03 09:03:52 crc kubenswrapper[4790]: I1003 09:03:52.111032 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" event={"ID":"06011acd-ae43-466d-b25d-70eddbc5c34f","Type":"ContainerDied","Data":"2fd1f46a035f6bf66cf49a48b3f9f359ee6f8397884d7d40a2fe447c96e478eb"} Oct 03 09:03:53 crc kubenswrapper[4790]: I1003 09:03:53.556013 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:53 crc kubenswrapper[4790]: I1003 09:03:53.602206 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-ssh-key\") pod \"06011acd-ae43-466d-b25d-70eddbc5c34f\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " Oct 03 09:03:53 crc kubenswrapper[4790]: I1003 09:03:53.602461 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kfng\" (UniqueName: \"kubernetes.io/projected/06011acd-ae43-466d-b25d-70eddbc5c34f-kube-api-access-4kfng\") pod \"06011acd-ae43-466d-b25d-70eddbc5c34f\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " Oct 03 09:03:53 crc kubenswrapper[4790]: I1003 09:03:53.602600 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-inventory\") pod \"06011acd-ae43-466d-b25d-70eddbc5c34f\" (UID: \"06011acd-ae43-466d-b25d-70eddbc5c34f\") " Oct 03 09:03:53 crc 
kubenswrapper[4790]: I1003 09:03:53.608480 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06011acd-ae43-466d-b25d-70eddbc5c34f-kube-api-access-4kfng" (OuterVolumeSpecName: "kube-api-access-4kfng") pod "06011acd-ae43-466d-b25d-70eddbc5c34f" (UID: "06011acd-ae43-466d-b25d-70eddbc5c34f"). InnerVolumeSpecName "kube-api-access-4kfng". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:03:53 crc kubenswrapper[4790]: I1003 09:03:53.635046 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "06011acd-ae43-466d-b25d-70eddbc5c34f" (UID: "06011acd-ae43-466d-b25d-70eddbc5c34f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:53 crc kubenswrapper[4790]: I1003 09:03:53.637310 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-inventory" (OuterVolumeSpecName: "inventory") pod "06011acd-ae43-466d-b25d-70eddbc5c34f" (UID: "06011acd-ae43-466d-b25d-70eddbc5c34f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:03:53 crc kubenswrapper[4790]: I1003 09:03:53.705014 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kfng\" (UniqueName: \"kubernetes.io/projected/06011acd-ae43-466d-b25d-70eddbc5c34f-kube-api-access-4kfng\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:53 crc kubenswrapper[4790]: I1003 09:03:53.705050 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:53 crc kubenswrapper[4790]: I1003 09:03:53.705061 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06011acd-ae43-466d-b25d-70eddbc5c34f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.130909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" event={"ID":"06011acd-ae43-466d-b25d-70eddbc5c34f","Type":"ContainerDied","Data":"2d5b745e0015e016d2bcef67477aebe6fafbef58d69b01a61d3518db61e0b7c8"} Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.130944 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d5b745e0015e016d2bcef67477aebe6fafbef58d69b01a61d3518db61e0b7c8" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.131245 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c52nn" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.198100 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg"] Oct 03 09:03:54 crc kubenswrapper[4790]: E1003 09:03:54.198640 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06011acd-ae43-466d-b25d-70eddbc5c34f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.198664 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="06011acd-ae43-466d-b25d-70eddbc5c34f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.198888 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="06011acd-ae43-466d-b25d-70eddbc5c34f" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.199736 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.201623 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.201995 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.202186 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.202368 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.223625 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg"] Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.319990 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.320236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.320461 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.320636 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qss2g\" (UniqueName: \"kubernetes.io/projected/97eb47a0-1a88-49ee-9d24-b9f0624ed543-kube-api-access-qss2g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.422724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.423497 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.424127 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.424336 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qss2g\" (UniqueName: \"kubernetes.io/projected/97eb47a0-1a88-49ee-9d24-b9f0624ed543-kube-api-access-qss2g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.430956 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.432097 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.432112 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.443289 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qss2g\" (UniqueName: \"kubernetes.io/projected/97eb47a0-1a88-49ee-9d24-b9f0624ed543-kube-api-access-qss2g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:54 crc kubenswrapper[4790]: I1003 09:03:54.525708 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:03:55 crc kubenswrapper[4790]: I1003 09:03:55.036892 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg"] Oct 03 09:03:55 crc kubenswrapper[4790]: W1003 09:03:55.037963 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97eb47a0_1a88_49ee_9d24_b9f0624ed543.slice/crio-095ce341b01aafab8865ca442b46ac393c5e0d67be3dd4ac32a62bef3eccb386 WatchSource:0}: Error finding container 095ce341b01aafab8865ca442b46ac393c5e0d67be3dd4ac32a62bef3eccb386: Status 404 returned error can't find the container with id 095ce341b01aafab8865ca442b46ac393c5e0d67be3dd4ac32a62bef3eccb386 Oct 03 09:03:55 crc kubenswrapper[4790]: I1003 09:03:55.140474 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" event={"ID":"97eb47a0-1a88-49ee-9d24-b9f0624ed543","Type":"ContainerStarted","Data":"095ce341b01aafab8865ca442b46ac393c5e0d67be3dd4ac32a62bef3eccb386"} Oct 03 09:03:56 crc kubenswrapper[4790]: I1003 09:03:56.150723 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" event={"ID":"97eb47a0-1a88-49ee-9d24-b9f0624ed543","Type":"ContainerStarted","Data":"1ea2da1d68422e0676ba6c34c448665c63c55f2e9b4acb5defcd0addbd5905cd"} Oct 03 09:03:56 crc kubenswrapper[4790]: 
I1003 09:03:56.166209 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" podStartSLOduration=1.7128245039999999 podStartE2EDuration="2.166179156s" podCreationTimestamp="2025-10-03 09:03:54 +0000 UTC" firstStartedPulling="2025-10-03 09:03:55.04037676 +0000 UTC m=+1421.393311008" lastFinishedPulling="2025-10-03 09:03:55.493731402 +0000 UTC m=+1421.846665660" observedRunningTime="2025-10-03 09:03:56.165025014 +0000 UTC m=+1422.517959282" watchObservedRunningTime="2025-10-03 09:03:56.166179156 +0000 UTC m=+1422.519113424" Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.109603 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-klkx7"] Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.112694 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.124641 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-klkx7"] Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.218036 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-catalog-content\") pod \"certified-operators-klkx7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.218105 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p9wp\" (UniqueName: \"kubernetes.io/projected/68c30331-a727-4d5c-939b-85ea5bcea8d7-kube-api-access-7p9wp\") pod \"certified-operators-klkx7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " pod="openshift-marketplace/certified-operators-klkx7" Oct 
03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.218265 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-utilities\") pod \"certified-operators-klkx7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.320605 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-catalog-content\") pod \"certified-operators-klkx7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.320665 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p9wp\" (UniqueName: \"kubernetes.io/projected/68c30331-a727-4d5c-939b-85ea5bcea8d7-kube-api-access-7p9wp\") pod \"certified-operators-klkx7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.320708 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-utilities\") pod \"certified-operators-klkx7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.321532 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-utilities\") pod \"certified-operators-klkx7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:12 crc 
kubenswrapper[4790]: I1003 09:04:12.321530 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-catalog-content\") pod \"certified-operators-klkx7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.350814 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p9wp\" (UniqueName: \"kubernetes.io/projected/68c30331-a727-4d5c-939b-85ea5bcea8d7-kube-api-access-7p9wp\") pod \"certified-operators-klkx7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.474267 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:12 crc kubenswrapper[4790]: I1003 09:04:12.768692 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-klkx7"] Oct 03 09:04:13 crc kubenswrapper[4790]: I1003 09:04:13.388716 4790 generic.go:334] "Generic (PLEG): container finished" podID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerID="9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1" exitCode=0 Oct 03 09:04:13 crc kubenswrapper[4790]: I1003 09:04:13.388763 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klkx7" event={"ID":"68c30331-a727-4d5c-939b-85ea5bcea8d7","Type":"ContainerDied","Data":"9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1"} Oct 03 09:04:13 crc kubenswrapper[4790]: I1003 09:04:13.388789 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klkx7" 
event={"ID":"68c30331-a727-4d5c-939b-85ea5bcea8d7","Type":"ContainerStarted","Data":"3b789838f2e65a67413dc0d3c733f9d3e1a64b28814183a85af9ff632d2a7b39"} Oct 03 09:04:14 crc kubenswrapper[4790]: I1003 09:04:14.400168 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klkx7" event={"ID":"68c30331-a727-4d5c-939b-85ea5bcea8d7","Type":"ContainerStarted","Data":"2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b"} Oct 03 09:04:15 crc kubenswrapper[4790]: I1003 09:04:15.411129 4790 generic.go:334] "Generic (PLEG): container finished" podID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerID="2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b" exitCode=0 Oct 03 09:04:15 crc kubenswrapper[4790]: I1003 09:04:15.411181 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klkx7" event={"ID":"68c30331-a727-4d5c-939b-85ea5bcea8d7","Type":"ContainerDied","Data":"2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b"} Oct 03 09:04:16 crc kubenswrapper[4790]: I1003 09:04:16.438192 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klkx7" event={"ID":"68c30331-a727-4d5c-939b-85ea5bcea8d7","Type":"ContainerStarted","Data":"7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac"} Oct 03 09:04:16 crc kubenswrapper[4790]: I1003 09:04:16.456275 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-klkx7" podStartSLOduration=1.872539435 podStartE2EDuration="4.456257581s" podCreationTimestamp="2025-10-03 09:04:12 +0000 UTC" firstStartedPulling="2025-10-03 09:04:13.391498126 +0000 UTC m=+1439.744432364" lastFinishedPulling="2025-10-03 09:04:15.975216272 +0000 UTC m=+1442.328150510" observedRunningTime="2025-10-03 09:04:16.453716031 +0000 UTC m=+1442.806650299" watchObservedRunningTime="2025-10-03 09:04:16.456257581 +0000 UTC 
m=+1442.809191819" Oct 03 09:04:20 crc kubenswrapper[4790]: I1003 09:04:20.090893 4790 scope.go:117] "RemoveContainer" containerID="1b075cf75721e4b4dc222872f5825468c72b3460427c9967d29f7432a5c07169" Oct 03 09:04:20 crc kubenswrapper[4790]: I1003 09:04:20.244304 4790 scope.go:117] "RemoveContainer" containerID="c2fe2465c74cb35c41f727ed18c5dc12b448787c1a2a117121bc2df28cf5752c" Oct 03 09:04:20 crc kubenswrapper[4790]: I1003 09:04:20.282290 4790 scope.go:117] "RemoveContainer" containerID="eb8c3dc71c8ca8b48950da0bcc2bd797b69afd28744ac4b7dd0cd9dd51529407" Oct 03 09:04:20 crc kubenswrapper[4790]: I1003 09:04:20.333748 4790 scope.go:117] "RemoveContainer" containerID="c3fca4c476f7b449af2718dd47299c6d7cdaf0c30e27ccf14f5fcf0430a8173c" Oct 03 09:04:20 crc kubenswrapper[4790]: I1003 09:04:20.369627 4790 scope.go:117] "RemoveContainer" containerID="e71013313068ea9ba19dfe304d8f5cc8e067a087124a03955964c966041a4c30" Oct 03 09:04:22 crc kubenswrapper[4790]: I1003 09:04:22.477111 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:22 crc kubenswrapper[4790]: I1003 09:04:22.477510 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:22 crc kubenswrapper[4790]: I1003 09:04:22.552632 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:22 crc kubenswrapper[4790]: I1003 09:04:22.627335 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:22 crc kubenswrapper[4790]: I1003 09:04:22.795816 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-klkx7"] Oct 03 09:04:24 crc kubenswrapper[4790]: I1003 09:04:24.520155 4790 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-klkx7" podUID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerName="registry-server" containerID="cri-o://7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac" gracePeriod=2 Oct 03 09:04:24 crc kubenswrapper[4790]: I1003 09:04:24.978907 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.082650 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-catalog-content\") pod \"68c30331-a727-4d5c-939b-85ea5bcea8d7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.082862 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p9wp\" (UniqueName: \"kubernetes.io/projected/68c30331-a727-4d5c-939b-85ea5bcea8d7-kube-api-access-7p9wp\") pod \"68c30331-a727-4d5c-939b-85ea5bcea8d7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.082976 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-utilities\") pod \"68c30331-a727-4d5c-939b-85ea5bcea8d7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.085174 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-utilities" (OuterVolumeSpecName: "utilities") pod "68c30331-a727-4d5c-939b-85ea5bcea8d7" (UID: "68c30331-a727-4d5c-939b-85ea5bcea8d7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.096284 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c30331-a727-4d5c-939b-85ea5bcea8d7-kube-api-access-7p9wp" (OuterVolumeSpecName: "kube-api-access-7p9wp") pod "68c30331-a727-4d5c-939b-85ea5bcea8d7" (UID: "68c30331-a727-4d5c-939b-85ea5bcea8d7"). InnerVolumeSpecName "kube-api-access-7p9wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.184302 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68c30331-a727-4d5c-939b-85ea5bcea8d7" (UID: "68c30331-a727-4d5c-939b-85ea5bcea8d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.185452 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-catalog-content\") pod \"68c30331-a727-4d5c-939b-85ea5bcea8d7\" (UID: \"68c30331-a727-4d5c-939b-85ea5bcea8d7\") " Oct 03 09:04:25 crc kubenswrapper[4790]: W1003 09:04:25.185565 4790 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/68c30331-a727-4d5c-939b-85ea5bcea8d7/volumes/kubernetes.io~empty-dir/catalog-content Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.185610 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68c30331-a727-4d5c-939b-85ea5bcea8d7" (UID: "68c30331-a727-4d5c-939b-85ea5bcea8d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.186345 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.186419 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68c30331-a727-4d5c-939b-85ea5bcea8d7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.186486 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p9wp\" (UniqueName: \"kubernetes.io/projected/68c30331-a727-4d5c-939b-85ea5bcea8d7-kube-api-access-7p9wp\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.532839 4790 generic.go:334] "Generic (PLEG): container finished" podID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerID="7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac" exitCode=0 Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.532886 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klkx7" event={"ID":"68c30331-a727-4d5c-939b-85ea5bcea8d7","Type":"ContainerDied","Data":"7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac"} Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.532916 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klkx7" event={"ID":"68c30331-a727-4d5c-939b-85ea5bcea8d7","Type":"ContainerDied","Data":"3b789838f2e65a67413dc0d3c733f9d3e1a64b28814183a85af9ff632d2a7b39"} Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.532933 4790 scope.go:117] "RemoveContainer" containerID="7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 
09:04:25.533085 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-klkx7" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.557177 4790 scope.go:117] "RemoveContainer" containerID="2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.570507 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-klkx7"] Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.583289 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-klkx7"] Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.590904 4790 scope.go:117] "RemoveContainer" containerID="9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.637477 4790 scope.go:117] "RemoveContainer" containerID="7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac" Oct 03 09:04:25 crc kubenswrapper[4790]: E1003 09:04:25.638126 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac\": container with ID starting with 7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac not found: ID does not exist" containerID="7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.638178 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac"} err="failed to get container status \"7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac\": rpc error: code = NotFound desc = could not find container \"7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac\": container with ID starting with 
7eb2833a339873795da18558201ce39a5cbda1f309b6d23c30e4083f98e8a7ac not found: ID does not exist" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.638214 4790 scope.go:117] "RemoveContainer" containerID="2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b" Oct 03 09:04:25 crc kubenswrapper[4790]: E1003 09:04:25.638930 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b\": container with ID starting with 2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b not found: ID does not exist" containerID="2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.638985 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b"} err="failed to get container status \"2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b\": rpc error: code = NotFound desc = could not find container \"2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b\": container with ID starting with 2a784b07d50156b6fc3877716c2a4361adda0b235842b2cb6312bf1114ad679b not found: ID does not exist" Oct 03 09:04:25 crc kubenswrapper[4790]: I1003 09:04:25.639021 4790 scope.go:117] "RemoveContainer" containerID="9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1" Oct 03 09:04:25 crc kubenswrapper[4790]: E1003 09:04:25.639618 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1\": container with ID starting with 9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1 not found: ID does not exist" containerID="9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1" Oct 03 09:04:25 crc 
kubenswrapper[4790]: I1003 09:04:25.639744 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1"} err="failed to get container status \"9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1\": rpc error: code = NotFound desc = could not find container \"9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1\": container with ID starting with 9090525e56693765447f194838ec6b037ac48776734a796727cb83ef3abbdae1 not found: ID does not exist" Oct 03 09:04:26 crc kubenswrapper[4790]: I1003 09:04:26.339031 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c30331-a727-4d5c-939b-85ea5bcea8d7" path="/var/lib/kubelet/pods/68c30331-a727-4d5c-939b-85ea5bcea8d7/volumes" Oct 03 09:04:40 crc kubenswrapper[4790]: I1003 09:04:40.186952 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:04:40 crc kubenswrapper[4790]: I1003 09:04:40.187558 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.157153 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tkf24"] Oct 03 09:04:43 crc kubenswrapper[4790]: E1003 09:04:43.158004 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerName="registry-server" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 
09:04:43.158023 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerName="registry-server" Oct 03 09:04:43 crc kubenswrapper[4790]: E1003 09:04:43.158041 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerName="extract-content" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.158049 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerName="extract-content" Oct 03 09:04:43 crc kubenswrapper[4790]: E1003 09:04:43.158068 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerName="extract-utilities" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.158076 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerName="extract-utilities" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.158324 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c30331-a727-4d5c-939b-85ea5bcea8d7" containerName="registry-server" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.160182 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.198661 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tkf24"] Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.346031 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-utilities\") pod \"community-operators-tkf24\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.346091 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-catalog-content\") pod \"community-operators-tkf24\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.346233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz5qn\" (UniqueName: \"kubernetes.io/projected/ddc7a1ac-31de-42c8-9eb3-e959481725c5-kube-api-access-hz5qn\") pod \"community-operators-tkf24\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.448295 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz5qn\" (UniqueName: \"kubernetes.io/projected/ddc7a1ac-31de-42c8-9eb3-e959481725c5-kube-api-access-hz5qn\") pod \"community-operators-tkf24\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.448425 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-utilities\") pod \"community-operators-tkf24\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.448462 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-catalog-content\") pod \"community-operators-tkf24\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.449730 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-utilities\") pod \"community-operators-tkf24\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.449774 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-catalog-content\") pod \"community-operators-tkf24\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.475663 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz5qn\" (UniqueName: \"kubernetes.io/projected/ddc7a1ac-31de-42c8-9eb3-e959481725c5-kube-api-access-hz5qn\") pod \"community-operators-tkf24\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.481104 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:43 crc kubenswrapper[4790]: I1003 09:04:43.991906 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tkf24"] Oct 03 09:04:44 crc kubenswrapper[4790]: I1003 09:04:44.719384 4790 generic.go:334] "Generic (PLEG): container finished" podID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" containerID="eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434" exitCode=0 Oct 03 09:04:44 crc kubenswrapper[4790]: I1003 09:04:44.719453 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkf24" event={"ID":"ddc7a1ac-31de-42c8-9eb3-e959481725c5","Type":"ContainerDied","Data":"eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434"} Oct 03 09:04:44 crc kubenswrapper[4790]: I1003 09:04:44.719709 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkf24" event={"ID":"ddc7a1ac-31de-42c8-9eb3-e959481725c5","Type":"ContainerStarted","Data":"e50924ea5b5bc830ad8da5a48414dc2271e75822e04b9d086c84dd5530ec6339"} Oct 03 09:04:46 crc kubenswrapper[4790]: I1003 09:04:46.740363 4790 generic.go:334] "Generic (PLEG): container finished" podID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" containerID="f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b" exitCode=0 Oct 03 09:04:46 crc kubenswrapper[4790]: I1003 09:04:46.740498 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkf24" event={"ID":"ddc7a1ac-31de-42c8-9eb3-e959481725c5","Type":"ContainerDied","Data":"f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b"} Oct 03 09:04:48 crc kubenswrapper[4790]: I1003 09:04:48.765259 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkf24" 
event={"ID":"ddc7a1ac-31de-42c8-9eb3-e959481725c5","Type":"ContainerStarted","Data":"076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f"} Oct 03 09:04:48 crc kubenswrapper[4790]: I1003 09:04:48.788787 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tkf24" podStartSLOduration=2.866186596 podStartE2EDuration="5.788760361s" podCreationTimestamp="2025-10-03 09:04:43 +0000 UTC" firstStartedPulling="2025-10-03 09:04:44.722250744 +0000 UTC m=+1471.075184992" lastFinishedPulling="2025-10-03 09:04:47.644824519 +0000 UTC m=+1473.997758757" observedRunningTime="2025-10-03 09:04:48.783446375 +0000 UTC m=+1475.136380613" watchObservedRunningTime="2025-10-03 09:04:48.788760361 +0000 UTC m=+1475.141694639" Oct 03 09:04:53 crc kubenswrapper[4790]: I1003 09:04:53.481197 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:53 crc kubenswrapper[4790]: I1003 09:04:53.481883 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:53 crc kubenswrapper[4790]: I1003 09:04:53.545891 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:53 crc kubenswrapper[4790]: I1003 09:04:53.858915 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:53 crc kubenswrapper[4790]: I1003 09:04:53.905445 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tkf24"] Oct 03 09:04:55 crc kubenswrapper[4790]: I1003 09:04:55.830234 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tkf24" podUID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" containerName="registry-server" 
containerID="cri-o://076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f" gracePeriod=2 Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.292053 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.401505 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz5qn\" (UniqueName: \"kubernetes.io/projected/ddc7a1ac-31de-42c8-9eb3-e959481725c5-kube-api-access-hz5qn\") pod \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.401808 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-utilities\") pod \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.401832 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-catalog-content\") pod \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\" (UID: \"ddc7a1ac-31de-42c8-9eb3-e959481725c5\") " Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.403255 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-utilities" (OuterVolumeSpecName: "utilities") pod "ddc7a1ac-31de-42c8-9eb3-e959481725c5" (UID: "ddc7a1ac-31de-42c8-9eb3-e959481725c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.409899 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc7a1ac-31de-42c8-9eb3-e959481725c5-kube-api-access-hz5qn" (OuterVolumeSpecName: "kube-api-access-hz5qn") pod "ddc7a1ac-31de-42c8-9eb3-e959481725c5" (UID: "ddc7a1ac-31de-42c8-9eb3-e959481725c5"). InnerVolumeSpecName "kube-api-access-hz5qn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.456974 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddc7a1ac-31de-42c8-9eb3-e959481725c5" (UID: "ddc7a1ac-31de-42c8-9eb3-e959481725c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.504915 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz5qn\" (UniqueName: \"kubernetes.io/projected/ddc7a1ac-31de-42c8-9eb3-e959481725c5-kube-api-access-hz5qn\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.504966 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.504978 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddc7a1ac-31de-42c8-9eb3-e959481725c5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.843178 4790 generic.go:334] "Generic (PLEG): container finished" podID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" 
containerID="076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f" exitCode=0 Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.843244 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkf24" event={"ID":"ddc7a1ac-31de-42c8-9eb3-e959481725c5","Type":"ContainerDied","Data":"076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f"} Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.843276 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tkf24" event={"ID":"ddc7a1ac-31de-42c8-9eb3-e959481725c5","Type":"ContainerDied","Data":"e50924ea5b5bc830ad8da5a48414dc2271e75822e04b9d086c84dd5530ec6339"} Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.843284 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tkf24" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.843294 4790 scope.go:117] "RemoveContainer" containerID="076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.875052 4790 scope.go:117] "RemoveContainer" containerID="f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.902068 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tkf24"] Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.906855 4790 scope.go:117] "RemoveContainer" containerID="eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.915417 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tkf24"] Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.962040 4790 scope.go:117] "RemoveContainer" containerID="076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f" Oct 03 
09:04:56 crc kubenswrapper[4790]: E1003 09:04:56.962605 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f\": container with ID starting with 076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f not found: ID does not exist" containerID="076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.962711 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f"} err="failed to get container status \"076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f\": rpc error: code = NotFound desc = could not find container \"076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f\": container with ID starting with 076f814c7b8ceb245c94a5f249d92e771664232191bab95f309bc59630151e6f not found: ID does not exist" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.962789 4790 scope.go:117] "RemoveContainer" containerID="f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b" Oct 03 09:04:56 crc kubenswrapper[4790]: E1003 09:04:56.963370 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b\": container with ID starting with f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b not found: ID does not exist" containerID="f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.963449 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b"} err="failed to get container status 
\"f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b\": rpc error: code = NotFound desc = could not find container \"f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b\": container with ID starting with f4d00ba8f75c2bf77639c184f91e1e42e43b5a49ff43389e247f84b1db86fe4b not found: ID does not exist" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.963518 4790 scope.go:117] "RemoveContainer" containerID="eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434" Oct 03 09:04:56 crc kubenswrapper[4790]: E1003 09:04:56.964049 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434\": container with ID starting with eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434 not found: ID does not exist" containerID="eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434" Oct 03 09:04:56 crc kubenswrapper[4790]: I1003 09:04:56.964840 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434"} err="failed to get container status \"eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434\": rpc error: code = NotFound desc = could not find container \"eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434\": container with ID starting with eb2f3d3153ab961b5c9bcc2b817eeda8f66349a1e890ea579b2b8bea73230434 not found: ID does not exist" Oct 03 09:04:58 crc kubenswrapper[4790]: I1003 09:04:58.340405 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" path="/var/lib/kubelet/pods/ddc7a1ac-31de-42c8-9eb3-e959481725c5/volumes" Oct 03 09:05:10 crc kubenswrapper[4790]: I1003 09:05:10.187132 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:05:10 crc kubenswrapper[4790]: I1003 09:05:10.188133 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:05:20 crc kubenswrapper[4790]: I1003 09:05:20.515040 4790 scope.go:117] "RemoveContainer" containerID="30e0fe45b4ecd503d294bff8ce70c90a46cb82b7bf568d3f1c53ed5fb95d4976" Oct 03 09:05:40 crc kubenswrapper[4790]: I1003 09:05:40.186480 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:05:40 crc kubenswrapper[4790]: I1003 09:05:40.187120 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:05:40 crc kubenswrapper[4790]: I1003 09:05:40.187185 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 09:05:40 crc kubenswrapper[4790]: I1003 09:05:40.187975 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700"} 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:05:40 crc kubenswrapper[4790]: I1003 09:05:40.188028 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" gracePeriod=600 Oct 03 09:05:40 crc kubenswrapper[4790]: E1003 09:05:40.322261 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:05:41 crc kubenswrapper[4790]: I1003 09:05:41.307117 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" exitCode=0 Oct 03 09:05:41 crc kubenswrapper[4790]: I1003 09:05:41.307197 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700"} Oct 03 09:05:41 crc kubenswrapper[4790]: I1003 09:05:41.307427 4790 scope.go:117] "RemoveContainer" containerID="c2fcffece27550dbe144383b7c492d28b1ee8a149f741f79ecb706a01a9252b8" Oct 03 09:05:41 crc kubenswrapper[4790]: I1003 09:05:41.308216 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 
03 09:05:41 crc kubenswrapper[4790]: E1003 09:05:41.308598 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:05:56 crc kubenswrapper[4790]: I1003 09:05:56.329407 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:05:56 crc kubenswrapper[4790]: E1003 09:05:56.330515 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:06:11 crc kubenswrapper[4790]: I1003 09:06:11.328794 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:06:11 crc kubenswrapper[4790]: E1003 09:06:11.330718 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:06:20 crc kubenswrapper[4790]: I1003 09:06:20.614680 4790 scope.go:117] "RemoveContainer" 
containerID="f2af8048b69e7ae948e0156778d10832fb1b1892abc42d130934e3ab87b85c32" Oct 03 09:06:22 crc kubenswrapper[4790]: I1003 09:06:22.329847 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:06:22 crc kubenswrapper[4790]: E1003 09:06:22.330224 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:06:36 crc kubenswrapper[4790]: I1003 09:06:36.328610 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:06:36 crc kubenswrapper[4790]: E1003 09:06:36.329437 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.661683 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ckptl"] Oct 03 09:06:47 crc kubenswrapper[4790]: E1003 09:06:47.662729 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" containerName="extract-utilities" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.662745 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" containerName="extract-utilities" Oct 03 
09:06:47 crc kubenswrapper[4790]: E1003 09:06:47.662794 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" containerName="extract-content" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.662801 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" containerName="extract-content" Oct 03 09:06:47 crc kubenswrapper[4790]: E1003 09:06:47.662812 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" containerName="registry-server" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.662820 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" containerName="registry-server" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.663032 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc7a1ac-31de-42c8-9eb3-e959481725c5" containerName="registry-server" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.664770 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.681148 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ckptl"] Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.713945 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzpvz\" (UniqueName: \"kubernetes.io/projected/8ef39a03-096c-445f-821b-9ec306fa6f17-kube-api-access-nzpvz\") pod \"redhat-operators-ckptl\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.714201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-catalog-content\") pod \"redhat-operators-ckptl\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.714425 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-utilities\") pod \"redhat-operators-ckptl\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.817208 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzpvz\" (UniqueName: \"kubernetes.io/projected/8ef39a03-096c-445f-821b-9ec306fa6f17-kube-api-access-nzpvz\") pod \"redhat-operators-ckptl\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.817355 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-catalog-content\") pod \"redhat-operators-ckptl\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.817436 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-utilities\") pod \"redhat-operators-ckptl\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.817879 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-utilities\") pod \"redhat-operators-ckptl\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.818355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-catalog-content\") pod \"redhat-operators-ckptl\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.836753 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzpvz\" (UniqueName: \"kubernetes.io/projected/8ef39a03-096c-445f-821b-9ec306fa6f17-kube-api-access-nzpvz\") pod \"redhat-operators-ckptl\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:47 crc kubenswrapper[4790]: I1003 09:06:47.982032 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:48 crc kubenswrapper[4790]: I1003 09:06:48.328390 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:06:48 crc kubenswrapper[4790]: E1003 09:06:48.329232 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:06:48 crc kubenswrapper[4790]: I1003 09:06:48.508825 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ckptl"] Oct 03 09:06:49 crc kubenswrapper[4790]: I1003 09:06:49.023195 4790 generic.go:334] "Generic (PLEG): container finished" podID="8ef39a03-096c-445f-821b-9ec306fa6f17" containerID="8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449" exitCode=0 Oct 03 09:06:49 crc kubenswrapper[4790]: I1003 09:06:49.023253 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckptl" event={"ID":"8ef39a03-096c-445f-821b-9ec306fa6f17","Type":"ContainerDied","Data":"8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449"} Oct 03 09:06:49 crc kubenswrapper[4790]: I1003 09:06:49.023505 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckptl" event={"ID":"8ef39a03-096c-445f-821b-9ec306fa6f17","Type":"ContainerStarted","Data":"0989034ce7ae58619968f7c56c336a595a586d3f069254b46f6e1cf048b29f6e"} Oct 03 09:06:49 crc kubenswrapper[4790]: I1003 09:06:49.025398 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:06:50 crc 
kubenswrapper[4790]: I1003 09:06:50.051263 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckptl" event={"ID":"8ef39a03-096c-445f-821b-9ec306fa6f17","Type":"ContainerStarted","Data":"642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05"} Oct 03 09:06:52 crc kubenswrapper[4790]: I1003 09:06:52.073517 4790 generic.go:334] "Generic (PLEG): container finished" podID="8ef39a03-096c-445f-821b-9ec306fa6f17" containerID="642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05" exitCode=0 Oct 03 09:06:52 crc kubenswrapper[4790]: I1003 09:06:52.073595 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckptl" event={"ID":"8ef39a03-096c-445f-821b-9ec306fa6f17","Type":"ContainerDied","Data":"642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05"} Oct 03 09:06:53 crc kubenswrapper[4790]: I1003 09:06:53.087292 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckptl" event={"ID":"8ef39a03-096c-445f-821b-9ec306fa6f17","Type":"ContainerStarted","Data":"74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e"} Oct 03 09:06:53 crc kubenswrapper[4790]: I1003 09:06:53.107751 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ckptl" podStartSLOduration=2.5177771509999998 podStartE2EDuration="6.107729171s" podCreationTimestamp="2025-10-03 09:06:47 +0000 UTC" firstStartedPulling="2025-10-03 09:06:49.025157385 +0000 UTC m=+1595.378091623" lastFinishedPulling="2025-10-03 09:06:52.615109405 +0000 UTC m=+1598.968043643" observedRunningTime="2025-10-03 09:06:53.10438914 +0000 UTC m=+1599.457323388" watchObservedRunningTime="2025-10-03 09:06:53.107729171 +0000 UTC m=+1599.460663449" Oct 03 09:06:57 crc kubenswrapper[4790]: I1003 09:06:57.982430 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:57 crc kubenswrapper[4790]: I1003 09:06:57.983141 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:58 crc kubenswrapper[4790]: I1003 09:06:58.034370 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:58 crc kubenswrapper[4790]: I1003 09:06:58.186132 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:06:58 crc kubenswrapper[4790]: I1003 09:06:58.267696 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ckptl"] Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.149040 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ckptl" podUID="8ef39a03-096c-445f-821b-9ec306fa6f17" containerName="registry-server" containerID="cri-o://74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e" gracePeriod=2 Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.329043 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:07:00 crc kubenswrapper[4790]: E1003 09:07:00.329642 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.612475 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.795965 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzpvz\" (UniqueName: \"kubernetes.io/projected/8ef39a03-096c-445f-821b-9ec306fa6f17-kube-api-access-nzpvz\") pod \"8ef39a03-096c-445f-821b-9ec306fa6f17\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.796277 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-catalog-content\") pod \"8ef39a03-096c-445f-821b-9ec306fa6f17\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.796430 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-utilities\") pod \"8ef39a03-096c-445f-821b-9ec306fa6f17\" (UID: \"8ef39a03-096c-445f-821b-9ec306fa6f17\") " Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.797189 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-utilities" (OuterVolumeSpecName: "utilities") pod "8ef39a03-096c-445f-821b-9ec306fa6f17" (UID: "8ef39a03-096c-445f-821b-9ec306fa6f17"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.797833 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.801814 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef39a03-096c-445f-821b-9ec306fa6f17-kube-api-access-nzpvz" (OuterVolumeSpecName: "kube-api-access-nzpvz") pod "8ef39a03-096c-445f-821b-9ec306fa6f17" (UID: "8ef39a03-096c-445f-821b-9ec306fa6f17"). InnerVolumeSpecName "kube-api-access-nzpvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.885377 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ef39a03-096c-445f-821b-9ec306fa6f17" (UID: "8ef39a03-096c-445f-821b-9ec306fa6f17"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.899980 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzpvz\" (UniqueName: \"kubernetes.io/projected/8ef39a03-096c-445f-821b-9ec306fa6f17-kube-api-access-nzpvz\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:00 crc kubenswrapper[4790]: I1003 09:07:00.900017 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ef39a03-096c-445f-821b-9ec306fa6f17-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.160996 4790 generic.go:334] "Generic (PLEG): container finished" podID="8ef39a03-096c-445f-821b-9ec306fa6f17" containerID="74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e" exitCode=0 Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.161044 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckptl" event={"ID":"8ef39a03-096c-445f-821b-9ec306fa6f17","Type":"ContainerDied","Data":"74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e"} Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.161064 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ckptl" Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.161082 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ckptl" event={"ID":"8ef39a03-096c-445f-821b-9ec306fa6f17","Type":"ContainerDied","Data":"0989034ce7ae58619968f7c56c336a595a586d3f069254b46f6e1cf048b29f6e"} Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.161102 4790 scope.go:117] "RemoveContainer" containerID="74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e" Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.190475 4790 scope.go:117] "RemoveContainer" containerID="642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05" Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.201640 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ckptl"] Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.213989 4790 scope.go:117] "RemoveContainer" containerID="8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449" Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.214306 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ckptl"] Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.267811 4790 scope.go:117] "RemoveContainer" containerID="74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e" Oct 03 09:07:01 crc kubenswrapper[4790]: E1003 09:07:01.268238 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e\": container with ID starting with 74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e not found: ID does not exist" containerID="74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e" Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.268285 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e"} err="failed to get container status \"74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e\": rpc error: code = NotFound desc = could not find container \"74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e\": container with ID starting with 74c4df237c1742fca4756929a69572c6818837275a257049a42066f6ed71dd2e not found: ID does not exist" Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.268311 4790 scope.go:117] "RemoveContainer" containerID="642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05" Oct 03 09:07:01 crc kubenswrapper[4790]: E1003 09:07:01.268591 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05\": container with ID starting with 642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05 not found: ID does not exist" containerID="642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05" Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.268619 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05"} err="failed to get container status \"642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05\": rpc error: code = NotFound desc = could not find container \"642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05\": container with ID starting with 642b7a51b9eaaa48711c37bc048fc6df1d85f699c1810da618fc9827d398ee05 not found: ID does not exist" Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.268636 4790 scope.go:117] "RemoveContainer" containerID="8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449" Oct 03 09:07:01 crc kubenswrapper[4790]: E1003 
09:07:01.268996 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449\": container with ID starting with 8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449 not found: ID does not exist" containerID="8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449" Oct 03 09:07:01 crc kubenswrapper[4790]: I1003 09:07:01.269030 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449"} err="failed to get container status \"8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449\": rpc error: code = NotFound desc = could not find container \"8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449\": container with ID starting with 8555dc9a4cbe8fbfbf789c3278fbcd3d2969bea100773bbe91aeb4bbea4ac449 not found: ID does not exist" Oct 03 09:07:02 crc kubenswrapper[4790]: I1003 09:07:02.341489 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef39a03-096c-445f-821b-9ec306fa6f17" path="/var/lib/kubelet/pods/8ef39a03-096c-445f-821b-9ec306fa6f17/volumes" Oct 03 09:07:09 crc kubenswrapper[4790]: I1003 09:07:09.237952 4790 generic.go:334] "Generic (PLEG): container finished" podID="97eb47a0-1a88-49ee-9d24-b9f0624ed543" containerID="1ea2da1d68422e0676ba6c34c448665c63c55f2e9b4acb5defcd0addbd5905cd" exitCode=0 Oct 03 09:07:09 crc kubenswrapper[4790]: I1003 09:07:09.238206 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" event={"ID":"97eb47a0-1a88-49ee-9d24-b9f0624ed543","Type":"ContainerDied","Data":"1ea2da1d68422e0676ba6c34c448665c63c55f2e9b4acb5defcd0addbd5905cd"} Oct 03 09:07:10 crc kubenswrapper[4790]: I1003 09:07:10.752431 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:07:10 crc kubenswrapper[4790]: I1003 09:07:10.894765 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-inventory\") pod \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " Oct 03 09:07:10 crc kubenswrapper[4790]: I1003 09:07:10.894908 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-bootstrap-combined-ca-bundle\") pod \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " Oct 03 09:07:10 crc kubenswrapper[4790]: I1003 09:07:10.894969 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qss2g\" (UniqueName: \"kubernetes.io/projected/97eb47a0-1a88-49ee-9d24-b9f0624ed543-kube-api-access-qss2g\") pod \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " Oct 03 09:07:10 crc kubenswrapper[4790]: I1003 09:07:10.895012 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-ssh-key\") pod \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\" (UID: \"97eb47a0-1a88-49ee-9d24-b9f0624ed543\") " Oct 03 09:07:10 crc kubenswrapper[4790]: I1003 09:07:10.900102 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "97eb47a0-1a88-49ee-9d24-b9f0624ed543" (UID: "97eb47a0-1a88-49ee-9d24-b9f0624ed543"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:10 crc kubenswrapper[4790]: I1003 09:07:10.900395 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97eb47a0-1a88-49ee-9d24-b9f0624ed543-kube-api-access-qss2g" (OuterVolumeSpecName: "kube-api-access-qss2g") pod "97eb47a0-1a88-49ee-9d24-b9f0624ed543" (UID: "97eb47a0-1a88-49ee-9d24-b9f0624ed543"). InnerVolumeSpecName "kube-api-access-qss2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:07:10 crc kubenswrapper[4790]: I1003 09:07:10.922531 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-inventory" (OuterVolumeSpecName: "inventory") pod "97eb47a0-1a88-49ee-9d24-b9f0624ed543" (UID: "97eb47a0-1a88-49ee-9d24-b9f0624ed543"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:10 crc kubenswrapper[4790]: I1003 09:07:10.922832 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "97eb47a0-1a88-49ee-9d24-b9f0624ed543" (UID: "97eb47a0-1a88-49ee-9d24-b9f0624ed543"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.000505 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.000563 4790 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.000620 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qss2g\" (UniqueName: \"kubernetes.io/projected/97eb47a0-1a88-49ee-9d24-b9f0624ed543-kube-api-access-qss2g\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.000633 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/97eb47a0-1a88-49ee-9d24-b9f0624ed543-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.261923 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" event={"ID":"97eb47a0-1a88-49ee-9d24-b9f0624ed543","Type":"ContainerDied","Data":"095ce341b01aafab8865ca442b46ac393c5e0d67be3dd4ac32a62bef3eccb386"} Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.261974 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="095ce341b01aafab8865ca442b46ac393c5e0d67be3dd4ac32a62bef3eccb386" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.262025 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.331751 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:07:11 crc kubenswrapper[4790]: E1003 09:07:11.332016 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.360045 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz"] Oct 03 09:07:11 crc kubenswrapper[4790]: E1003 09:07:11.360814 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97eb47a0-1a88-49ee-9d24-b9f0624ed543" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.360903 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="97eb47a0-1a88-49ee-9d24-b9f0624ed543" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 09:07:11 crc kubenswrapper[4790]: E1003 09:07:11.360998 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef39a03-096c-445f-821b-9ec306fa6f17" containerName="extract-utilities" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.361064 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef39a03-096c-445f-821b-9ec306fa6f17" containerName="extract-utilities" Oct 03 09:07:11 crc kubenswrapper[4790]: E1003 09:07:11.361156 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef39a03-096c-445f-821b-9ec306fa6f17" 
containerName="extract-content" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.361229 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef39a03-096c-445f-821b-9ec306fa6f17" containerName="extract-content" Oct 03 09:07:11 crc kubenswrapper[4790]: E1003 09:07:11.361352 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef39a03-096c-445f-821b-9ec306fa6f17" containerName="registry-server" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.361430 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef39a03-096c-445f-821b-9ec306fa6f17" containerName="registry-server" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.361767 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="97eb47a0-1a88-49ee-9d24-b9f0624ed543" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.361855 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef39a03-096c-445f-821b-9ec306fa6f17" containerName="registry-server" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.362845 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.365434 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.365944 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.366205 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.366273 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.369094 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz"] Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.513232 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.513290 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.513312 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54f6h\" (UniqueName: \"kubernetes.io/projected/2542827b-a700-41bf-80ff-22928d83630a-kube-api-access-54f6h\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.615964 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.616430 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.616696 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54f6h\" (UniqueName: \"kubernetes.io/projected/2542827b-a700-41bf-80ff-22928d83630a-kube-api-access-54f6h\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.620201 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.621268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.645278 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54f6h\" (UniqueName: \"kubernetes.io/projected/2542827b-a700-41bf-80ff-22928d83630a-kube-api-access-54f6h\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:11 crc kubenswrapper[4790]: I1003 09:07:11.718103 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:07:12 crc kubenswrapper[4790]: I1003 09:07:12.270138 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz"] Oct 03 09:07:13 crc kubenswrapper[4790]: I1003 09:07:13.286571 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" event={"ID":"2542827b-a700-41bf-80ff-22928d83630a","Type":"ContainerStarted","Data":"c8cd4b968df968aa05a2294808d83ea5dab5d5c42dc79068a5d2bed462c9c81f"} Oct 03 09:07:13 crc kubenswrapper[4790]: I1003 09:07:13.287101 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" event={"ID":"2542827b-a700-41bf-80ff-22928d83630a","Type":"ContainerStarted","Data":"1e3a9df620d891b8ac852620f35f5ecb8b9928570d5b8a8326ab43b1adeb016d"} Oct 03 09:07:13 crc kubenswrapper[4790]: I1003 09:07:13.311883 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" podStartSLOduration=1.878115662 podStartE2EDuration="2.311858705s" podCreationTimestamp="2025-10-03 09:07:11 +0000 UTC" firstStartedPulling="2025-10-03 09:07:12.276522915 +0000 UTC m=+1618.629457153" lastFinishedPulling="2025-10-03 09:07:12.710265958 +0000 UTC m=+1619.063200196" observedRunningTime="2025-10-03 09:07:13.300253708 +0000 UTC m=+1619.653187976" watchObservedRunningTime="2025-10-03 09:07:13.311858705 +0000 UTC m=+1619.664792963" Oct 03 09:07:20 crc kubenswrapper[4790]: I1003 09:07:20.675822 4790 scope.go:117] "RemoveContainer" containerID="b97964d83ecfd2cc23becb7229b3a327bd0cd183d18069d6578472649c370e17" Oct 03 09:07:20 crc kubenswrapper[4790]: I1003 09:07:20.707496 4790 scope.go:117] "RemoveContainer" containerID="b3c7fadbe8c3098ca776e6142c6fb85e9f18c3076604286e7dbcc5c1c15cc8c7" Oct 03 
09:07:20 crc kubenswrapper[4790]: I1003 09:07:20.773532 4790 scope.go:117] "RemoveContainer" containerID="8e837c52774af4ef3334da1340eb30cba3810dfe1ad01ff69c870840019e5bc1" Oct 03 09:07:20 crc kubenswrapper[4790]: I1003 09:07:20.795933 4790 scope.go:117] "RemoveContainer" containerID="2ce6ad919c4abde1bd65f5a01d210a624389d2c7552ea3c4d04d6f1762fbca6d" Oct 03 09:07:23 crc kubenswrapper[4790]: I1003 09:07:23.328718 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:07:23 crc kubenswrapper[4790]: E1003 09:07:23.329764 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:07:31 crc kubenswrapper[4790]: I1003 09:07:31.045349 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-dxr5m"] Oct 03 09:07:31 crc kubenswrapper[4790]: I1003 09:07:31.056022 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8pfjr"] Oct 03 09:07:31 crc kubenswrapper[4790]: I1003 09:07:31.066874 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-28lqw"] Oct 03 09:07:31 crc kubenswrapper[4790]: I1003 09:07:31.076014 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-dxr5m"] Oct 03 09:07:31 crc kubenswrapper[4790]: I1003 09:07:31.088009 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8pfjr"] Oct 03 09:07:31 crc kubenswrapper[4790]: I1003 09:07:31.097406 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-28lqw"] Oct 03 09:07:32 crc 
kubenswrapper[4790]: I1003 09:07:32.343302 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03453a08-49f9-4433-ba29-bd7f5ce716ff" path="/var/lib/kubelet/pods/03453a08-49f9-4433-ba29-bd7f5ce716ff/volumes" Oct 03 09:07:32 crc kubenswrapper[4790]: I1003 09:07:32.344629 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba913c08-cae4-40c2-9ec8-a4865beb0918" path="/var/lib/kubelet/pods/ba913c08-cae4-40c2-9ec8-a4865beb0918/volumes" Oct 03 09:07:32 crc kubenswrapper[4790]: I1003 09:07:32.345916 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5099e4b-6707-4f2c-9d6f-504c101cf5c4" path="/var/lib/kubelet/pods/d5099e4b-6707-4f2c-9d6f-504c101cf5c4/volumes" Oct 03 09:07:37 crc kubenswrapper[4790]: I1003 09:07:37.036179 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jqhqt"] Oct 03 09:07:37 crc kubenswrapper[4790]: I1003 09:07:37.044261 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jqhqt"] Oct 03 09:07:38 crc kubenswrapper[4790]: I1003 09:07:38.028282 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3dbc-account-create-qgx9w"] Oct 03 09:07:38 crc kubenswrapper[4790]: I1003 09:07:38.040444 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3dbc-account-create-qgx9w"] Oct 03 09:07:38 crc kubenswrapper[4790]: I1003 09:07:38.328332 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:07:38 crc kubenswrapper[4790]: E1003 09:07:38.328933 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:07:38 crc kubenswrapper[4790]: I1003 09:07:38.345754 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec270558-02cf-4d29-932d-b92a185a0396" path="/var/lib/kubelet/pods/ec270558-02cf-4d29-932d-b92a185a0396/volumes" Oct 03 09:07:38 crc kubenswrapper[4790]: I1003 09:07:38.346707 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0dccb48-67d6-443b-8572-ff55454618ef" path="/var/lib/kubelet/pods/f0dccb48-67d6-443b-8572-ff55454618ef/volumes" Oct 03 09:07:49 crc kubenswrapper[4790]: I1003 09:07:49.328398 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:07:49 crc kubenswrapper[4790]: E1003 09:07:49.329174 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:07:55 crc kubenswrapper[4790]: I1003 09:07:55.048829 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-5be0-account-create-clnmh"] Oct 03 09:07:55 crc kubenswrapper[4790]: I1003 09:07:55.062734 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b0cf-account-create-j9m59"] Oct 03 09:07:55 crc kubenswrapper[4790]: I1003 09:07:55.075060 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-5be0-account-create-clnmh"] Oct 03 09:07:55 crc kubenswrapper[4790]: I1003 09:07:55.083815 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b0cf-account-create-j9m59"] Oct 03 09:07:55 crc 
kubenswrapper[4790]: I1003 09:07:55.093385 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7a52-account-create-bzmtx"] Oct 03 09:07:55 crc kubenswrapper[4790]: I1003 09:07:55.103292 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7a52-account-create-bzmtx"] Oct 03 09:07:56 crc kubenswrapper[4790]: I1003 09:07:56.354103 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da9e362-3ba2-4a58-a4e1-06c4ff646cca" path="/var/lib/kubelet/pods/5da9e362-3ba2-4a58-a4e1-06c4ff646cca/volumes" Oct 03 09:07:56 crc kubenswrapper[4790]: I1003 09:07:56.355194 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b7d6d53-4ca2-49d7-92bb-9144efa2c104" path="/var/lib/kubelet/pods/6b7d6d53-4ca2-49d7-92bb-9144efa2c104/volumes" Oct 03 09:07:56 crc kubenswrapper[4790]: I1003 09:07:56.356634 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a04e5a9b-9d29-41ec-8ca6-3106d6d431ac" path="/var/lib/kubelet/pods/a04e5a9b-9d29-41ec-8ca6-3106d6d431ac/volumes" Oct 03 09:08:01 crc kubenswrapper[4790]: I1003 09:08:01.328071 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:08:01 crc kubenswrapper[4790]: E1003 09:08:01.328820 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.049175 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-djrdw"] Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.062828 4790 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/glance-db-sync-m92cz"] Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.071124 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sn67l"] Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.079471 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-c8zcf"] Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.086926 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-c8zcf"] Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.093906 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-sn67l"] Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.101146 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-djrdw"] Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.109774 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-m92cz"] Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.342204 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2859de83-046e-475b-bb37-fb8f4a3de695" path="/var/lib/kubelet/pods/2859de83-046e-475b-bb37-fb8f4a3de695/volumes" Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.343771 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b6fdd58-025f-4e30-a384-d4f2104e5cea" path="/var/lib/kubelet/pods/9b6fdd58-025f-4e30-a384-d4f2104e5cea/volumes" Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.344352 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a103c12c-c153-4074-ab87-d9ddb6536748" path="/var/lib/kubelet/pods/a103c12c-c153-4074-ab87-d9ddb6536748/volumes" Oct 03 09:08:08 crc kubenswrapper[4790]: I1003 09:08:08.345274 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e" 
path="/var/lib/kubelet/pods/eb9c1e83-318c-48b5-ae1c-ec2f41a6da7e/volumes" Oct 03 09:08:14 crc kubenswrapper[4790]: I1003 09:08:14.338597 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:08:14 crc kubenswrapper[4790]: E1003 09:08:14.347401 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:08:19 crc kubenswrapper[4790]: I1003 09:08:19.037187 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-f9rhk"] Oct 03 09:08:19 crc kubenswrapper[4790]: I1003 09:08:19.046224 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-f9rhk"] Oct 03 09:08:20 crc kubenswrapper[4790]: I1003 09:08:20.342907 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cde60e0e-020c-42ca-9677-05fc2d3d7a72" path="/var/lib/kubelet/pods/cde60e0e-020c-42ca-9677-05fc2d3d7a72/volumes" Oct 03 09:08:20 crc kubenswrapper[4790]: I1003 09:08:20.901804 4790 scope.go:117] "RemoveContainer" containerID="8f7ce841158cd06344c6fffb05bd4f61802c4eea97cd88fea1ce751cfab4f994" Oct 03 09:08:20 crc kubenswrapper[4790]: I1003 09:08:20.938274 4790 scope.go:117] "RemoveContainer" containerID="854d53f0aaa919fc6a3c59bd76bb5f09d0b211de937538ba392a28e415ca8db2" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:20.999975 4790 scope.go:117] "RemoveContainer" containerID="d09f3aca7d9ce961a7309d936bb2ebdbbe99d1c086f5a27eeba56ef4836247cb" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:21.061218 4790 scope.go:117] "RemoveContainer" 
containerID="83d384e00619ab506d3e1e6efec39d645745cc27660e6db983ea55d26cf7f3d9" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:21.105624 4790 scope.go:117] "RemoveContainer" containerID="51f3869a3a9c24efd2663439eb15f61a38025ec13909a422fd2767c106cdbe4a" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:21.157006 4790 scope.go:117] "RemoveContainer" containerID="6c1c56bdc2139c51486d7bee72ed848dde56d34a71b6450dbdfc11afe4625775" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:21.202192 4790 scope.go:117] "RemoveContainer" containerID="29a7188a62adcae7fa702c6f2b865eeb573ec43ca9cb59ab1a3ab007c0a35f02" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:21.232513 4790 scope.go:117] "RemoveContainer" containerID="b3d94db5ea9f3fcafbabcf709c49a597d4397710f4d76be3a6835ddc2a8db74d" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:21.256902 4790 scope.go:117] "RemoveContainer" containerID="856ea50cea4f73c4347d3d07896076a26addfb96d480bd72a2e628c27e5ed309" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:21.282067 4790 scope.go:117] "RemoveContainer" containerID="4ba6f46b86ce0149c3824563a53028342ad3c1387370b258b8c7ff65b4b88c86" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:21.310549 4790 scope.go:117] "RemoveContainer" containerID="e0c82b798526105642c0ff4fd307316f3a97bc290a9d45108d73157e07e5b386" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:21.335880 4790 scope.go:117] "RemoveContainer" containerID="c380022c023961acb7fa7fb3fe1b74bb348a1e3aa4d56b08a9866dd031a1c876" Oct 03 09:08:21 crc kubenswrapper[4790]: I1003 09:08:21.359950 4790 scope.go:117] "RemoveContainer" containerID="002a0e39313fce8f4fe9012d3c16ea9336cdf187d2705860eccd71d0763552ae" Oct 03 09:08:25 crc kubenswrapper[4790]: I1003 09:08:25.033372 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-14c0-account-create-82p7s"] Oct 03 09:08:25 crc kubenswrapper[4790]: I1003 09:08:25.043330 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-14c0-account-create-82p7s"] Oct 03 09:08:25 crc kubenswrapper[4790]: I1003 09:08:25.328564 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:08:25 crc kubenswrapper[4790]: E1003 09:08:25.328922 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:08:26 crc kubenswrapper[4790]: I1003 09:08:26.342232 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11270f2-ead7-4001-94f9-54e66cfbe017" path="/var/lib/kubelet/pods/f11270f2-ead7-4001-94f9-54e66cfbe017/volumes" Oct 03 09:08:27 crc kubenswrapper[4790]: I1003 09:08:27.026619 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5d29-account-create-jqwpk"] Oct 03 09:08:27 crc kubenswrapper[4790]: I1003 09:08:27.034274 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-2100-account-create-q4khm"] Oct 03 09:08:27 crc kubenswrapper[4790]: I1003 09:08:27.043822 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-2100-account-create-q4khm"] Oct 03 09:08:27 crc kubenswrapper[4790]: I1003 09:08:27.052964 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5d29-account-create-jqwpk"] Oct 03 09:08:28 crc kubenswrapper[4790]: I1003 09:08:28.339810 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4146a9d4-bf8b-4bce-aca7-531fb9e272ea" path="/var/lib/kubelet/pods/4146a9d4-bf8b-4bce-aca7-531fb9e272ea/volumes" Oct 03 09:08:28 crc kubenswrapper[4790]: I1003 09:08:28.340997 4790 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc" path="/var/lib/kubelet/pods/feee3da2-b9c2-4091-b0cf-cacd8f7f0fdc/volumes" Oct 03 09:08:30 crc kubenswrapper[4790]: I1003 09:08:30.044702 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-8j2zm"] Oct 03 09:08:30 crc kubenswrapper[4790]: I1003 09:08:30.058755 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-8j2zm"] Oct 03 09:08:30 crc kubenswrapper[4790]: I1003 09:08:30.343938 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d78d2bd-0369-4e94-9f44-4e57634ab8b6" path="/var/lib/kubelet/pods/8d78d2bd-0369-4e94-9f44-4e57634ab8b6/volumes" Oct 03 09:08:38 crc kubenswrapper[4790]: I1003 09:08:38.328159 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:08:38 crc kubenswrapper[4790]: E1003 09:08:38.329259 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:08:40 crc kubenswrapper[4790]: I1003 09:08:40.147339 4790 generic.go:334] "Generic (PLEG): container finished" podID="2542827b-a700-41bf-80ff-22928d83630a" containerID="c8cd4b968df968aa05a2294808d83ea5dab5d5c42dc79068a5d2bed462c9c81f" exitCode=0 Oct 03 09:08:40 crc kubenswrapper[4790]: I1003 09:08:40.147430 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" event={"ID":"2542827b-a700-41bf-80ff-22928d83630a","Type":"ContainerDied","Data":"c8cd4b968df968aa05a2294808d83ea5dab5d5c42dc79068a5d2bed462c9c81f"} Oct 03 
09:08:41 crc kubenswrapper[4790]: I1003 09:08:41.580226 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:08:41 crc kubenswrapper[4790]: I1003 09:08:41.719371 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-inventory\") pod \"2542827b-a700-41bf-80ff-22928d83630a\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " Oct 03 09:08:41 crc kubenswrapper[4790]: I1003 09:08:41.719557 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54f6h\" (UniqueName: \"kubernetes.io/projected/2542827b-a700-41bf-80ff-22928d83630a-kube-api-access-54f6h\") pod \"2542827b-a700-41bf-80ff-22928d83630a\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " Oct 03 09:08:41 crc kubenswrapper[4790]: I1003 09:08:41.719749 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-ssh-key\") pod \"2542827b-a700-41bf-80ff-22928d83630a\" (UID: \"2542827b-a700-41bf-80ff-22928d83630a\") " Oct 03 09:08:41 crc kubenswrapper[4790]: I1003 09:08:41.727249 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2542827b-a700-41bf-80ff-22928d83630a-kube-api-access-54f6h" (OuterVolumeSpecName: "kube-api-access-54f6h") pod "2542827b-a700-41bf-80ff-22928d83630a" (UID: "2542827b-a700-41bf-80ff-22928d83630a"). InnerVolumeSpecName "kube-api-access-54f6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:08:41 crc kubenswrapper[4790]: I1003 09:08:41.753830 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2542827b-a700-41bf-80ff-22928d83630a" (UID: "2542827b-a700-41bf-80ff-22928d83630a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:41 crc kubenswrapper[4790]: I1003 09:08:41.758192 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-inventory" (OuterVolumeSpecName: "inventory") pod "2542827b-a700-41bf-80ff-22928d83630a" (UID: "2542827b-a700-41bf-80ff-22928d83630a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:08:41 crc kubenswrapper[4790]: I1003 09:08:41.823336 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:41 crc kubenswrapper[4790]: I1003 09:08:41.823379 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54f6h\" (UniqueName: \"kubernetes.io/projected/2542827b-a700-41bf-80ff-22928d83630a-kube-api-access-54f6h\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:41 crc kubenswrapper[4790]: I1003 09:08:41.823392 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2542827b-a700-41bf-80ff-22928d83630a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.166647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" 
event={"ID":"2542827b-a700-41bf-80ff-22928d83630a","Type":"ContainerDied","Data":"1e3a9df620d891b8ac852620f35f5ecb8b9928570d5b8a8326ab43b1adeb016d"} Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.166695 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e3a9df620d891b8ac852620f35f5ecb8b9928570d5b8a8326ab43b1adeb016d" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.166783 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.252145 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff"] Oct 03 09:08:42 crc kubenswrapper[4790]: E1003 09:08:42.252783 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2542827b-a700-41bf-80ff-22928d83630a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.252811 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2542827b-a700-41bf-80ff-22928d83630a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.253106 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2542827b-a700-41bf-80ff-22928d83630a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.254210 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.256457 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.257204 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.257300 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.257399 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.267170 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff"] Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.333212 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5p67\" (UniqueName: \"kubernetes.io/projected/94a4a6e2-8a66-4144-b0ea-893b7519c242-kube-api-access-k5p67\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.333269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 
09:08:42.333380 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.435636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5p67\" (UniqueName: \"kubernetes.io/projected/94a4a6e2-8a66-4144-b0ea-893b7519c242-kube-api-access-k5p67\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.435724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.435885 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.442060 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.442446 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.454412 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5p67\" (UniqueName: \"kubernetes.io/projected/94a4a6e2-8a66-4144-b0ea-893b7519c242-kube-api-access-k5p67\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:42 crc kubenswrapper[4790]: I1003 09:08:42.573343 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:08:43 crc kubenswrapper[4790]: I1003 09:08:43.103571 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff"] Oct 03 09:08:43 crc kubenswrapper[4790]: I1003 09:08:43.176631 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" event={"ID":"94a4a6e2-8a66-4144-b0ea-893b7519c242","Type":"ContainerStarted","Data":"9752931e4375c5f2b9f980685bb09d49f23500055ff08d8dcfea4fd6c4b697a3"} Oct 03 09:08:44 crc kubenswrapper[4790]: I1003 09:08:44.191921 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" event={"ID":"94a4a6e2-8a66-4144-b0ea-893b7519c242","Type":"ContainerStarted","Data":"b0bf0be280100e48ffbd71115eb28688bcc8359d5e540e51dac6334354cf8bc7"} Oct 03 09:08:50 crc kubenswrapper[4790]: I1003 09:08:50.329164 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:08:50 crc kubenswrapper[4790]: E1003 09:08:50.330094 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:08:53 crc kubenswrapper[4790]: I1003 09:08:53.040022 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" podStartSLOduration=10.496879737 podStartE2EDuration="11.040000284s" podCreationTimestamp="2025-10-03 09:08:42 +0000 UTC" 
firstStartedPulling="2025-10-03 09:08:43.097794169 +0000 UTC m=+1709.450728407" lastFinishedPulling="2025-10-03 09:08:43.640914716 +0000 UTC m=+1709.993848954" observedRunningTime="2025-10-03 09:08:44.209944752 +0000 UTC m=+1710.562879000" watchObservedRunningTime="2025-10-03 09:08:53.040000284 +0000 UTC m=+1719.392934522" Oct 03 09:08:53 crc kubenswrapper[4790]: I1003 09:08:53.049125 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bbwh5"] Oct 03 09:08:53 crc kubenswrapper[4790]: I1003 09:08:53.056501 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-vzdz9"] Oct 03 09:08:53 crc kubenswrapper[4790]: I1003 09:08:53.065797 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bbwh5"] Oct 03 09:08:53 crc kubenswrapper[4790]: I1003 09:08:53.074484 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-vzdz9"] Oct 03 09:08:54 crc kubenswrapper[4790]: I1003 09:08:54.340038 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7197a2e5-b159-4ea2-bcb6-a58faf21d4ad" path="/var/lib/kubelet/pods/7197a2e5-b159-4ea2-bcb6-a58faf21d4ad/volumes" Oct 03 09:08:54 crc kubenswrapper[4790]: I1003 09:08:54.340674 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a839597d-be75-4b9f-9d18-a2e8045a674b" path="/var/lib/kubelet/pods/a839597d-be75-4b9f-9d18-a2e8045a674b/volumes" Oct 03 09:09:05 crc kubenswrapper[4790]: I1003 09:09:05.327605 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:09:05 crc kubenswrapper[4790]: E1003 09:09:05.328368 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:09:14 crc kubenswrapper[4790]: I1003 09:09:14.058305 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-mfdjk"] Oct 03 09:09:14 crc kubenswrapper[4790]: I1003 09:09:14.073014 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-mfdjk"] Oct 03 09:09:14 crc kubenswrapper[4790]: I1003 09:09:14.345648 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df20e14b-d7d8-4cab-8b7c-61122da75fbf" path="/var/lib/kubelet/pods/df20e14b-d7d8-4cab-8b7c-61122da75fbf/volumes" Oct 03 09:09:18 crc kubenswrapper[4790]: I1003 09:09:18.330145 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:09:18 crc kubenswrapper[4790]: E1003 09:09:18.330791 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:09:21 crc kubenswrapper[4790]: I1003 09:09:21.630805 4790 scope.go:117] "RemoveContainer" containerID="b903ffd9c9737c9e3f4c122cd3010c618d42c0b548b05612976955055812f322" Oct 03 09:09:21 crc kubenswrapper[4790]: I1003 09:09:21.670142 4790 scope.go:117] "RemoveContainer" containerID="5bcbce90e5f552eaaadc6b92cb86678891a8513ca0aa2b956936185c44add82d" Oct 03 09:09:21 crc kubenswrapper[4790]: I1003 09:09:21.719713 4790 scope.go:117] "RemoveContainer" containerID="52f70ec8d15ca2b8722d40d109dd87907796a7a8eb04ba50e4a0d89100f00997" Oct 
03 09:09:21 crc kubenswrapper[4790]: I1003 09:09:21.783008 4790 scope.go:117] "RemoveContainer" containerID="0d7452c96884917989d401aa60ff69a7463322b4b922a19b9ee7a87d035506bb" Oct 03 09:09:21 crc kubenswrapper[4790]: I1003 09:09:21.874304 4790 scope.go:117] "RemoveContainer" containerID="db89e74913a6abb0a0855c991ea609b1ce7803291149958d8aa0f7fb86212ea7" Oct 03 09:09:21 crc kubenswrapper[4790]: I1003 09:09:21.909302 4790 scope.go:117] "RemoveContainer" containerID="436f9b4ae537c4ee8eadbceb94dcc53da18a9ada7d1217d7ec775e3e975c1a8d" Oct 03 09:09:21 crc kubenswrapper[4790]: I1003 09:09:21.948837 4790 scope.go:117] "RemoveContainer" containerID="3810111fe0dbe40d183e931305a5130d4a7c2a94606d047505cd9ff507de82e7" Oct 03 09:09:25 crc kubenswrapper[4790]: I1003 09:09:25.035505 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-brfpm"] Oct 03 09:09:25 crc kubenswrapper[4790]: I1003 09:09:25.045679 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-brfpm"] Oct 03 09:09:26 crc kubenswrapper[4790]: I1003 09:09:26.345817 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20340665-5e73-4bfd-8a1a-6d21e0d16432" path="/var/lib/kubelet/pods/20340665-5e73-4bfd-8a1a-6d21e0d16432/volumes" Oct 03 09:09:27 crc kubenswrapper[4790]: I1003 09:09:27.033897 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-gklmz"] Oct 03 09:09:27 crc kubenswrapper[4790]: I1003 09:09:27.045856 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-gklmz"] Oct 03 09:09:28 crc kubenswrapper[4790]: I1003 09:09:28.351453 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96250103-59af-4ec6-a036-f6e8222b02e0" path="/var/lib/kubelet/pods/96250103-59af-4ec6-a036-f6e8222b02e0/volumes" Oct 03 09:09:32 crc kubenswrapper[4790]: I1003 09:09:32.328785 4790 scope.go:117] "RemoveContainer" 
containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:09:32 crc kubenswrapper[4790]: E1003 09:09:32.329693 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:09:44 crc kubenswrapper[4790]: I1003 09:09:44.335453 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:09:44 crc kubenswrapper[4790]: E1003 09:09:44.336380 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:09:55 crc kubenswrapper[4790]: I1003 09:09:55.040796 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-h5qqd"] Oct 03 09:09:55 crc kubenswrapper[4790]: I1003 09:09:55.055733 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-q2zhq"] Oct 03 09:09:55 crc kubenswrapper[4790]: I1003 09:09:55.070086 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-rq7b8"] Oct 03 09:09:55 crc kubenswrapper[4790]: I1003 09:09:55.081269 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-rq7b8"] Oct 03 09:09:55 crc kubenswrapper[4790]: I1003 09:09:55.089457 4790 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-db-create-h5qqd"] Oct 03 09:09:55 crc kubenswrapper[4790]: I1003 09:09:55.096516 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-q2zhq"] Oct 03 09:09:56 crc kubenswrapper[4790]: I1003 09:09:56.339124 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ffe5a55-d80d-47ce-9fc9-0611c97437d1" path="/var/lib/kubelet/pods/0ffe5a55-d80d-47ce-9fc9-0611c97437d1/volumes" Oct 03 09:09:56 crc kubenswrapper[4790]: I1003 09:09:56.340248 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231c1e2d-3330-48e9-80cb-6f490ba17e37" path="/var/lib/kubelet/pods/231c1e2d-3330-48e9-80cb-6f490ba17e37/volumes" Oct 03 09:09:56 crc kubenswrapper[4790]: I1003 09:09:56.341119 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ca9811-81f8-4113-b7c0-ddcb16a66ad9" path="/var/lib/kubelet/pods/28ca9811-81f8-4113-b7c0-ddcb16a66ad9/volumes" Oct 03 09:09:58 crc kubenswrapper[4790]: I1003 09:09:58.328850 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:09:58 crc kubenswrapper[4790]: E1003 09:09:58.329473 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:09:58 crc kubenswrapper[4790]: I1003 09:09:58.927612 4790 generic.go:334] "Generic (PLEG): container finished" podID="94a4a6e2-8a66-4144-b0ea-893b7519c242" containerID="b0bf0be280100e48ffbd71115eb28688bcc8359d5e540e51dac6334354cf8bc7" exitCode=0 Oct 03 09:09:58 crc kubenswrapper[4790]: I1003 09:09:58.927661 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" event={"ID":"94a4a6e2-8a66-4144-b0ea-893b7519c242","Type":"ContainerDied","Data":"b0bf0be280100e48ffbd71115eb28688bcc8359d5e540e51dac6334354cf8bc7"} Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.313063 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.471170 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-inventory\") pod \"94a4a6e2-8a66-4144-b0ea-893b7519c242\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.471251 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-ssh-key\") pod \"94a4a6e2-8a66-4144-b0ea-893b7519c242\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.471517 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5p67\" (UniqueName: \"kubernetes.io/projected/94a4a6e2-8a66-4144-b0ea-893b7519c242-kube-api-access-k5p67\") pod \"94a4a6e2-8a66-4144-b0ea-893b7519c242\" (UID: \"94a4a6e2-8a66-4144-b0ea-893b7519c242\") " Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.478291 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a4a6e2-8a66-4144-b0ea-893b7519c242-kube-api-access-k5p67" (OuterVolumeSpecName: "kube-api-access-k5p67") pod "94a4a6e2-8a66-4144-b0ea-893b7519c242" (UID: "94a4a6e2-8a66-4144-b0ea-893b7519c242"). InnerVolumeSpecName "kube-api-access-k5p67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.502861 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-inventory" (OuterVolumeSpecName: "inventory") pod "94a4a6e2-8a66-4144-b0ea-893b7519c242" (UID: "94a4a6e2-8a66-4144-b0ea-893b7519c242"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.505045 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "94a4a6e2-8a66-4144-b0ea-893b7519c242" (UID: "94a4a6e2-8a66-4144-b0ea-893b7519c242"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.574278 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5p67\" (UniqueName: \"kubernetes.io/projected/94a4a6e2-8a66-4144-b0ea-893b7519c242-kube-api-access-k5p67\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.574313 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.574322 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94a4a6e2-8a66-4144-b0ea-893b7519c242-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.949959 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" 
event={"ID":"94a4a6e2-8a66-4144-b0ea-893b7519c242","Type":"ContainerDied","Data":"9752931e4375c5f2b9f980685bb09d49f23500055ff08d8dcfea4fd6c4b697a3"} Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.950011 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff" Oct 03 09:10:00 crc kubenswrapper[4790]: I1003 09:10:00.950020 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9752931e4375c5f2b9f980685bb09d49f23500055ff08d8dcfea4fd6c4b697a3" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.037810 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr"] Oct 03 09:10:01 crc kubenswrapper[4790]: E1003 09:10:01.038207 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a4a6e2-8a66-4144-b0ea-893b7519c242" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.038227 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a4a6e2-8a66-4144-b0ea-893b7519c242" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.038448 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a4a6e2-8a66-4144-b0ea-893b7519c242" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.039154 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.041452 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.041454 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.044136 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.044960 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.048874 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr"] Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.248259 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnxkv\" (UniqueName: \"kubernetes.io/projected/a18ef35c-666c-4fc3-b466-7d63bb44e942-kube-api-access-qnxkv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b75fr\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.248327 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b75fr\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 
09:10:01.248405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b75fr\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.350981 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnxkv\" (UniqueName: \"kubernetes.io/projected/a18ef35c-666c-4fc3-b466-7d63bb44e942-kube-api-access-qnxkv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b75fr\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.351046 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b75fr\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.351080 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b75fr\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.358414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-b75fr\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.358455 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b75fr\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.372419 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnxkv\" (UniqueName: \"kubernetes.io/projected/a18ef35c-666c-4fc3-b466-7d63bb44e942-kube-api-access-qnxkv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-b75fr\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:01 crc kubenswrapper[4790]: I1003 09:10:01.460374 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:02 crc kubenswrapper[4790]: I1003 09:10:02.001203 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr"] Oct 03 09:10:02 crc kubenswrapper[4790]: I1003 09:10:02.968270 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" event={"ID":"a18ef35c-666c-4fc3-b466-7d63bb44e942","Type":"ContainerStarted","Data":"66a9996e35d00c1feaf87916f37381aa8f38c15f66ca99bb6b995904d2407bf9"} Oct 03 09:10:02 crc kubenswrapper[4790]: I1003 09:10:02.969345 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" event={"ID":"a18ef35c-666c-4fc3-b466-7d63bb44e942","Type":"ContainerStarted","Data":"5747948bfc1944eeeb53ff82adc39a580c49365296493f9a2d7a7d599c88f138"} Oct 03 09:10:02 crc kubenswrapper[4790]: I1003 09:10:02.987162 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" podStartSLOduration=1.329001731 podStartE2EDuration="1.987143344s" podCreationTimestamp="2025-10-03 09:10:01 +0000 UTC" firstStartedPulling="2025-10-03 09:10:02.000764963 +0000 UTC m=+1788.353699191" lastFinishedPulling="2025-10-03 09:10:02.658906566 +0000 UTC m=+1789.011840804" observedRunningTime="2025-10-03 09:10:02.983009442 +0000 UTC m=+1789.335943680" watchObservedRunningTime="2025-10-03 09:10:02.987143344 +0000 UTC m=+1789.340077592" Oct 03 09:10:08 crc kubenswrapper[4790]: I1003 09:10:08.014609 4790 generic.go:334] "Generic (PLEG): container finished" podID="a18ef35c-666c-4fc3-b466-7d63bb44e942" containerID="66a9996e35d00c1feaf87916f37381aa8f38c15f66ca99bb6b995904d2407bf9" exitCode=0 Oct 03 09:10:08 crc kubenswrapper[4790]: I1003 09:10:08.014679 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" event={"ID":"a18ef35c-666c-4fc3-b466-7d63bb44e942","Type":"ContainerDied","Data":"66a9996e35d00c1feaf87916f37381aa8f38c15f66ca99bb6b995904d2407bf9"} Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.328840 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:10:09 crc kubenswrapper[4790]: E1003 09:10:09.329415 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.443873 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.613129 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-inventory\") pod \"a18ef35c-666c-4fc3-b466-7d63bb44e942\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.613364 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnxkv\" (UniqueName: \"kubernetes.io/projected/a18ef35c-666c-4fc3-b466-7d63bb44e942-kube-api-access-qnxkv\") pod \"a18ef35c-666c-4fc3-b466-7d63bb44e942\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.613461 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-ssh-key\") pod \"a18ef35c-666c-4fc3-b466-7d63bb44e942\" (UID: \"a18ef35c-666c-4fc3-b466-7d63bb44e942\") " Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.622184 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18ef35c-666c-4fc3-b466-7d63bb44e942-kube-api-access-qnxkv" (OuterVolumeSpecName: "kube-api-access-qnxkv") pod "a18ef35c-666c-4fc3-b466-7d63bb44e942" (UID: "a18ef35c-666c-4fc3-b466-7d63bb44e942"). InnerVolumeSpecName "kube-api-access-qnxkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.655656 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-inventory" (OuterVolumeSpecName: "inventory") pod "a18ef35c-666c-4fc3-b466-7d63bb44e942" (UID: "a18ef35c-666c-4fc3-b466-7d63bb44e942"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.655962 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a18ef35c-666c-4fc3-b466-7d63bb44e942" (UID: "a18ef35c-666c-4fc3-b466-7d63bb44e942"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.715756 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnxkv\" (UniqueName: \"kubernetes.io/projected/a18ef35c-666c-4fc3-b466-7d63bb44e942-kube-api-access-qnxkv\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.715794 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:09 crc kubenswrapper[4790]: I1003 09:10:09.715809 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a18ef35c-666c-4fc3-b466-7d63bb44e942-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.032935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" event={"ID":"a18ef35c-666c-4fc3-b466-7d63bb44e942","Type":"ContainerDied","Data":"5747948bfc1944eeeb53ff82adc39a580c49365296493f9a2d7a7d599c88f138"} Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.033020 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5747948bfc1944eeeb53ff82adc39a580c49365296493f9a2d7a7d599c88f138" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.033026 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-b75fr" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.100670 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf"] Oct 03 09:10:10 crc kubenswrapper[4790]: E1003 09:10:10.101221 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18ef35c-666c-4fc3-b466-7d63bb44e942" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.101250 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18ef35c-666c-4fc3-b466-7d63bb44e942" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.101548 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a18ef35c-666c-4fc3-b466-7d63bb44e942" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.102481 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.104660 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.104668 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.104963 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.106775 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.115090 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf"] Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.225962 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92q6w\" (UniqueName: \"kubernetes.io/projected/eae0b967-81c1-45eb-8240-26198c43a621-kube-api-access-92q6w\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nb4kf\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.226084 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nb4kf\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.226180 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nb4kf\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.327882 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nb4kf\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.327982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nb4kf\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.328119 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92q6w\" (UniqueName: \"kubernetes.io/projected/eae0b967-81c1-45eb-8240-26198c43a621-kube-api-access-92q6w\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nb4kf\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.334261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nb4kf\" (UID: 
\"eae0b967-81c1-45eb-8240-26198c43a621\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.334343 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nb4kf\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.346559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92q6w\" (UniqueName: \"kubernetes.io/projected/eae0b967-81c1-45eb-8240-26198c43a621-kube-api-access-92q6w\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-nb4kf\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.420231 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.889262 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bj5qr"] Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.892139 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.901170 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bj5qr"] Oct 03 09:10:10 crc kubenswrapper[4790]: I1003 09:10:10.988565 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf"] Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.042428 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" event={"ID":"eae0b967-81c1-45eb-8240-26198c43a621","Type":"ContainerStarted","Data":"ae1c77dfe8bdc7e1e710ebcf31e6b6598c92c3d3a3988a74b8953d7ade029772"} Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.043098 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-catalog-content\") pod \"redhat-marketplace-bj5qr\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.043145 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwjdf\" (UniqueName: \"kubernetes.io/projected/d4109a04-7003-4e7f-9d28-16efd6fcad59-kube-api-access-zwjdf\") pod \"redhat-marketplace-bj5qr\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.043175 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-utilities\") pod \"redhat-marketplace-bj5qr\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " 
pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.145319 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-catalog-content\") pod \"redhat-marketplace-bj5qr\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.146446 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwjdf\" (UniqueName: \"kubernetes.io/projected/d4109a04-7003-4e7f-9d28-16efd6fcad59-kube-api-access-zwjdf\") pod \"redhat-marketplace-bj5qr\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.146934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-utilities\") pod \"redhat-marketplace-bj5qr\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.146280 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-catalog-content\") pod \"redhat-marketplace-bj5qr\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.147398 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-utilities\") pod \"redhat-marketplace-bj5qr\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " pod="openshift-marketplace/redhat-marketplace-bj5qr" 
Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.172127 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwjdf\" (UniqueName: \"kubernetes.io/projected/d4109a04-7003-4e7f-9d28-16efd6fcad59-kube-api-access-zwjdf\") pod \"redhat-marketplace-bj5qr\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.218116 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:11 crc kubenswrapper[4790]: I1003 09:10:11.656255 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bj5qr"] Oct 03 09:10:11 crc kubenswrapper[4790]: W1003 09:10:11.663326 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4109a04_7003_4e7f_9d28_16efd6fcad59.slice/crio-d4301f4516323b2aeb17ff56c0531ac6e4afc22f4ffab8ecf3cb041d3873d904 WatchSource:0}: Error finding container d4301f4516323b2aeb17ff56c0531ac6e4afc22f4ffab8ecf3cb041d3873d904: Status 404 returned error can't find the container with id d4301f4516323b2aeb17ff56c0531ac6e4afc22f4ffab8ecf3cb041d3873d904 Oct 03 09:10:12 crc kubenswrapper[4790]: I1003 09:10:12.057932 4790 generic.go:334] "Generic (PLEG): container finished" podID="d4109a04-7003-4e7f-9d28-16efd6fcad59" containerID="5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8" exitCode=0 Oct 03 09:10:12 crc kubenswrapper[4790]: I1003 09:10:12.058111 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bj5qr" event={"ID":"d4109a04-7003-4e7f-9d28-16efd6fcad59","Type":"ContainerDied","Data":"5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8"} Oct 03 09:10:12 crc kubenswrapper[4790]: I1003 09:10:12.058321 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bj5qr" event={"ID":"d4109a04-7003-4e7f-9d28-16efd6fcad59","Type":"ContainerStarted","Data":"d4301f4516323b2aeb17ff56c0531ac6e4afc22f4ffab8ecf3cb041d3873d904"} Oct 03 09:10:12 crc kubenswrapper[4790]: I1003 09:10:12.060835 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" event={"ID":"eae0b967-81c1-45eb-8240-26198c43a621","Type":"ContainerStarted","Data":"aedf5586fc50b36676784110e99b1f6d1578e037219f1045108047ee7cc3537c"} Oct 03 09:10:12 crc kubenswrapper[4790]: I1003 09:10:12.110750 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" podStartSLOduration=1.6811239869999999 podStartE2EDuration="2.110730157s" podCreationTimestamp="2025-10-03 09:10:10 +0000 UTC" firstStartedPulling="2025-10-03 09:10:10.999192061 +0000 UTC m=+1797.352126299" lastFinishedPulling="2025-10-03 09:10:11.428798231 +0000 UTC m=+1797.781732469" observedRunningTime="2025-10-03 09:10:12.094867592 +0000 UTC m=+1798.447801840" watchObservedRunningTime="2025-10-03 09:10:12.110730157 +0000 UTC m=+1798.463664395" Oct 03 09:10:13 crc kubenswrapper[4790]: I1003 09:10:13.048777 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4d8b-account-create-qx5g5"] Oct 03 09:10:13 crc kubenswrapper[4790]: I1003 09:10:13.057548 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-510d-account-create-n6565"] Oct 03 09:10:13 crc kubenswrapper[4790]: I1003 09:10:13.067894 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4d8b-account-create-qx5g5"] Oct 03 09:10:13 crc kubenswrapper[4790]: I1003 09:10:13.077635 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e243-account-create-6jxhb"] Oct 03 09:10:13 crc kubenswrapper[4790]: I1003 09:10:13.085225 4790 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-510d-account-create-n6565"] Oct 03 09:10:13 crc kubenswrapper[4790]: I1003 09:10:13.092322 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e243-account-create-6jxhb"] Oct 03 09:10:14 crc kubenswrapper[4790]: I1003 09:10:14.080871 4790 generic.go:334] "Generic (PLEG): container finished" podID="d4109a04-7003-4e7f-9d28-16efd6fcad59" containerID="6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7" exitCode=0 Oct 03 09:10:14 crc kubenswrapper[4790]: I1003 09:10:14.080975 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bj5qr" event={"ID":"d4109a04-7003-4e7f-9d28-16efd6fcad59","Type":"ContainerDied","Data":"6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7"} Oct 03 09:10:14 crc kubenswrapper[4790]: I1003 09:10:14.348443 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a10df34-7c11-4117-9c35-81b9fa58e7e9" path="/var/lib/kubelet/pods/0a10df34-7c11-4117-9c35-81b9fa58e7e9/volumes" Oct 03 09:10:14 crc kubenswrapper[4790]: I1003 09:10:14.352619 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a0c8205-2d15-4333-ad01-ccfb999d5058" path="/var/lib/kubelet/pods/1a0c8205-2d15-4333-ad01-ccfb999d5058/volumes" Oct 03 09:10:14 crc kubenswrapper[4790]: I1003 09:10:14.353256 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65738136-e138-4c95-80a9-ee6cc6b54f8f" path="/var/lib/kubelet/pods/65738136-e138-4c95-80a9-ee6cc6b54f8f/volumes" Oct 03 09:10:15 crc kubenswrapper[4790]: I1003 09:10:15.097169 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bj5qr" event={"ID":"d4109a04-7003-4e7f-9d28-16efd6fcad59","Type":"ContainerStarted","Data":"5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32"} Oct 03 09:10:15 crc kubenswrapper[4790]: I1003 09:10:15.121415 4790 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bj5qr" podStartSLOduration=2.678015426 podStartE2EDuration="5.121395097s" podCreationTimestamp="2025-10-03 09:10:10 +0000 UTC" firstStartedPulling="2025-10-03 09:10:12.060111337 +0000 UTC m=+1798.413045565" lastFinishedPulling="2025-10-03 09:10:14.503490998 +0000 UTC m=+1800.856425236" observedRunningTime="2025-10-03 09:10:15.115115625 +0000 UTC m=+1801.468049883" watchObservedRunningTime="2025-10-03 09:10:15.121395097 +0000 UTC m=+1801.474329345" Oct 03 09:10:21 crc kubenswrapper[4790]: I1003 09:10:21.218976 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:21 crc kubenswrapper[4790]: I1003 09:10:21.219567 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:21 crc kubenswrapper[4790]: I1003 09:10:21.268390 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.109163 4790 scope.go:117] "RemoveContainer" containerID="8349d3d9c890e0f2559c2e27d15abf20cd2f6a976de2460c5606a8306b102d99" Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.141446 4790 scope.go:117] "RemoveContainer" containerID="a7dff9a615b4a115915a822588c7e829a583ee6905544f42265c30cf055594e4" Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.182545 4790 scope.go:117] "RemoveContainer" containerID="168b94606eb5a413ca725f93786204e75532fba048cab9f8cd2851139bd0b841" Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.232679 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.236381 4790 scope.go:117] "RemoveContainer" 
containerID="9e6dd9a9680ae937c98491561b19adaedb04ca56fedc6564020ecf1aed06adfb" Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.298148 4790 scope.go:117] "RemoveContainer" containerID="cc38a3e97666b4fe691265c985fd77f84b5a2e2fd71eb66b094e567d30914519" Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.306703 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bj5qr"] Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.328379 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:10:22 crc kubenswrapper[4790]: E1003 09:10:22.328841 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.342382 4790 scope.go:117] "RemoveContainer" containerID="98bb661d0f9ba60d912f0c38255f37a9a0aca621bc6e59eb791b9e2f702ed8f5" Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.408614 4790 scope.go:117] "RemoveContainer" containerID="bc56f9848047534b21e8193971dec96ae80429f4424af6a673934098585c9f14" Oct 03 09:10:22 crc kubenswrapper[4790]: I1003 09:10:22.433133 4790 scope.go:117] "RemoveContainer" containerID="93b11fd1cc534e59c11142d27c2367f2da2cd23b978580621a93c89a9cad1b82" Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.205991 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bj5qr" podUID="d4109a04-7003-4e7f-9d28-16efd6fcad59" containerName="registry-server" containerID="cri-o://5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32" 
gracePeriod=2 Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.719206 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.829724 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwjdf\" (UniqueName: \"kubernetes.io/projected/d4109a04-7003-4e7f-9d28-16efd6fcad59-kube-api-access-zwjdf\") pod \"d4109a04-7003-4e7f-9d28-16efd6fcad59\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.830053 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-utilities\") pod \"d4109a04-7003-4e7f-9d28-16efd6fcad59\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.830285 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-catalog-content\") pod \"d4109a04-7003-4e7f-9d28-16efd6fcad59\" (UID: \"d4109a04-7003-4e7f-9d28-16efd6fcad59\") " Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.831006 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-utilities" (OuterVolumeSpecName: "utilities") pod "d4109a04-7003-4e7f-9d28-16efd6fcad59" (UID: "d4109a04-7003-4e7f-9d28-16efd6fcad59"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.837763 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4109a04-7003-4e7f-9d28-16efd6fcad59-kube-api-access-zwjdf" (OuterVolumeSpecName: "kube-api-access-zwjdf") pod "d4109a04-7003-4e7f-9d28-16efd6fcad59" (UID: "d4109a04-7003-4e7f-9d28-16efd6fcad59"). InnerVolumeSpecName "kube-api-access-zwjdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.843263 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4109a04-7003-4e7f-9d28-16efd6fcad59" (UID: "d4109a04-7003-4e7f-9d28-16efd6fcad59"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.932130 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.932159 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwjdf\" (UniqueName: \"kubernetes.io/projected/d4109a04-7003-4e7f-9d28-16efd6fcad59-kube-api-access-zwjdf\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:24 crc kubenswrapper[4790]: I1003 09:10:24.932173 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4109a04-7003-4e7f-9d28-16efd6fcad59-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.218459 4790 generic.go:334] "Generic (PLEG): container finished" podID="d4109a04-7003-4e7f-9d28-16efd6fcad59" 
containerID="5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32" exitCode=0 Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.218524 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bj5qr" Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.218517 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bj5qr" event={"ID":"d4109a04-7003-4e7f-9d28-16efd6fcad59","Type":"ContainerDied","Data":"5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32"} Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.218899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bj5qr" event={"ID":"d4109a04-7003-4e7f-9d28-16efd6fcad59","Type":"ContainerDied","Data":"d4301f4516323b2aeb17ff56c0531ac6e4afc22f4ffab8ecf3cb041d3873d904"} Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.218919 4790 scope.go:117] "RemoveContainer" containerID="5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32" Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.243660 4790 scope.go:117] "RemoveContainer" containerID="6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7" Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.259105 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bj5qr"] Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.270437 4790 scope.go:117] "RemoveContainer" containerID="5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8" Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.275641 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bj5qr"] Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.336172 4790 scope.go:117] "RemoveContainer" containerID="5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32" Oct 03 
09:10:25 crc kubenswrapper[4790]: E1003 09:10:25.336905 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32\": container with ID starting with 5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32 not found: ID does not exist" containerID="5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32" Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.336978 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32"} err="failed to get container status \"5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32\": rpc error: code = NotFound desc = could not find container \"5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32\": container with ID starting with 5f0700db08681a0e124b71be8c1bdc6824fb9213bac1443f7aadd4cb700c2b32 not found: ID does not exist" Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.337031 4790 scope.go:117] "RemoveContainer" containerID="6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7" Oct 03 09:10:25 crc kubenswrapper[4790]: E1003 09:10:25.337410 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7\": container with ID starting with 6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7 not found: ID does not exist" containerID="6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7" Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.337467 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7"} err="failed to get container status 
\"6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7\": rpc error: code = NotFound desc = could not find container \"6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7\": container with ID starting with 6c664095e7a32607b112d357570b10bb980f28e9d566d9a041522d5573a786f7 not found: ID does not exist" Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.337496 4790 scope.go:117] "RemoveContainer" containerID="5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8" Oct 03 09:10:25 crc kubenswrapper[4790]: E1003 09:10:25.338001 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8\": container with ID starting with 5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8 not found: ID does not exist" containerID="5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8" Oct 03 09:10:25 crc kubenswrapper[4790]: I1003 09:10:25.338034 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8"} err="failed to get container status \"5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8\": rpc error: code = NotFound desc = could not find container \"5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8\": container with ID starting with 5fee1dea59ea5760edf2358342378dad87427fc595cdaf5e06310dafd674c2e8 not found: ID does not exist" Oct 03 09:10:26 crc kubenswrapper[4790]: I1003 09:10:26.338192 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4109a04-7003-4e7f-9d28-16efd6fcad59" path="/var/lib/kubelet/pods/d4109a04-7003-4e7f-9d28-16efd6fcad59/volumes" Oct 03 09:10:37 crc kubenswrapper[4790]: I1003 09:10:37.328851 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 
09:10:37 crc kubenswrapper[4790]: E1003 09:10:37.330313 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:10:44 crc kubenswrapper[4790]: I1003 09:10:44.040767 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tzhvj"] Oct 03 09:10:44 crc kubenswrapper[4790]: I1003 09:10:44.051566 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tzhvj"] Oct 03 09:10:44 crc kubenswrapper[4790]: I1003 09:10:44.347811 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42766976-3706-48c2-b456-1eef4f3d1303" path="/var/lib/kubelet/pods/42766976-3706-48c2-b456-1eef4f3d1303/volumes" Oct 03 09:10:49 crc kubenswrapper[4790]: I1003 09:10:49.438078 4790 generic.go:334] "Generic (PLEG): container finished" podID="eae0b967-81c1-45eb-8240-26198c43a621" containerID="aedf5586fc50b36676784110e99b1f6d1578e037219f1045108047ee7cc3537c" exitCode=0 Oct 03 09:10:49 crc kubenswrapper[4790]: I1003 09:10:49.438167 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" event={"ID":"eae0b967-81c1-45eb-8240-26198c43a621","Type":"ContainerDied","Data":"aedf5586fc50b36676784110e99b1f6d1578e037219f1045108047ee7cc3537c"} Oct 03 09:10:50 crc kubenswrapper[4790]: I1003 09:10:50.860392 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:50 crc kubenswrapper[4790]: I1003 09:10:50.959626 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-ssh-key\") pod \"eae0b967-81c1-45eb-8240-26198c43a621\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " Oct 03 09:10:50 crc kubenswrapper[4790]: I1003 09:10:50.959868 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92q6w\" (UniqueName: \"kubernetes.io/projected/eae0b967-81c1-45eb-8240-26198c43a621-kube-api-access-92q6w\") pod \"eae0b967-81c1-45eb-8240-26198c43a621\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " Oct 03 09:10:50 crc kubenswrapper[4790]: I1003 09:10:50.959907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-inventory\") pod \"eae0b967-81c1-45eb-8240-26198c43a621\" (UID: \"eae0b967-81c1-45eb-8240-26198c43a621\") " Oct 03 09:10:50 crc kubenswrapper[4790]: I1003 09:10:50.966096 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eae0b967-81c1-45eb-8240-26198c43a621-kube-api-access-92q6w" (OuterVolumeSpecName: "kube-api-access-92q6w") pod "eae0b967-81c1-45eb-8240-26198c43a621" (UID: "eae0b967-81c1-45eb-8240-26198c43a621"). InnerVolumeSpecName "kube-api-access-92q6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:10:50 crc kubenswrapper[4790]: I1003 09:10:50.992083 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-inventory" (OuterVolumeSpecName: "inventory") pod "eae0b967-81c1-45eb-8240-26198c43a621" (UID: "eae0b967-81c1-45eb-8240-26198c43a621"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:50 crc kubenswrapper[4790]: I1003 09:10:50.993786 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eae0b967-81c1-45eb-8240-26198c43a621" (UID: "eae0b967-81c1-45eb-8240-26198c43a621"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.062480 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92q6w\" (UniqueName: \"kubernetes.io/projected/eae0b967-81c1-45eb-8240-26198c43a621-kube-api-access-92q6w\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.062513 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.062523 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eae0b967-81c1-45eb-8240-26198c43a621-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.455747 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" event={"ID":"eae0b967-81c1-45eb-8240-26198c43a621","Type":"ContainerDied","Data":"ae1c77dfe8bdc7e1e710ebcf31e6b6598c92c3d3a3988a74b8953d7ade029772"} Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.455793 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae1c77dfe8bdc7e1e710ebcf31e6b6598c92c3d3a3988a74b8953d7ade029772" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.455798 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-nb4kf" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.543222 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8"] Oct 03 09:10:51 crc kubenswrapper[4790]: E1003 09:10:51.543822 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4109a04-7003-4e7f-9d28-16efd6fcad59" containerName="registry-server" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.543846 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4109a04-7003-4e7f-9d28-16efd6fcad59" containerName="registry-server" Oct 03 09:10:51 crc kubenswrapper[4790]: E1003 09:10:51.543874 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4109a04-7003-4e7f-9d28-16efd6fcad59" containerName="extract-content" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.543880 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4109a04-7003-4e7f-9d28-16efd6fcad59" containerName="extract-content" Oct 03 09:10:51 crc kubenswrapper[4790]: E1003 09:10:51.543899 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eae0b967-81c1-45eb-8240-26198c43a621" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.543906 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eae0b967-81c1-45eb-8240-26198c43a621" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:10:51 crc kubenswrapper[4790]: E1003 09:10:51.543924 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4109a04-7003-4e7f-9d28-16efd6fcad59" containerName="extract-utilities" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.543929 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4109a04-7003-4e7f-9d28-16efd6fcad59" containerName="extract-utilities" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.544115 
4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eae0b967-81c1-45eb-8240-26198c43a621" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.544132 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4109a04-7003-4e7f-9d28-16efd6fcad59" containerName="registry-server" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.544980 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.548465 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.548774 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.548935 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.548964 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.555976 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8"] Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.678283 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 
09:10:51.678706 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6v9b\" (UniqueName: \"kubernetes.io/projected/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-kube-api-access-d6v9b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.678804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.781152 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.781342 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.781454 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6v9b\" (UniqueName: \"kubernetes.io/projected/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-kube-api-access-d6v9b\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.785105 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.786444 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.801280 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6v9b\" (UniqueName: \"kubernetes.io/projected/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-kube-api-access-d6v9b\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:51 crc kubenswrapper[4790]: I1003 09:10:51.863193 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:10:52 crc kubenswrapper[4790]: I1003 09:10:52.362109 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:10:52 crc kubenswrapper[4790]: I1003 09:10:52.414213 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8"] Oct 03 09:10:52 crc kubenswrapper[4790]: I1003 09:10:52.465745 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" event={"ID":"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa","Type":"ContainerStarted","Data":"4d70465e448cbff4dbab221b6f492afea2ffbcb04e50bf8af8b493e8580fd552"} Oct 03 09:10:53 crc kubenswrapper[4790]: I1003 09:10:53.477877 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"6274a3236231c094c931f1afe606d7703eff60ab0480342f673b71ab8af0913c"} Oct 03 09:10:53 crc kubenswrapper[4790]: I1003 09:10:53.487043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" event={"ID":"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa","Type":"ContainerStarted","Data":"6e6e6762ad206be57fb8abeccae63d43a0f647cb9075206e8f35037e1532b390"} Oct 03 09:10:53 crc kubenswrapper[4790]: I1003 09:10:53.525931 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" podStartSLOduration=2.081102734 podStartE2EDuration="2.525911342s" podCreationTimestamp="2025-10-03 09:10:51 +0000 UTC" firstStartedPulling="2025-10-03 09:10:52.421256745 +0000 UTC m=+1838.774190983" lastFinishedPulling="2025-10-03 09:10:52.866065353 +0000 UTC m=+1839.218999591" 
observedRunningTime="2025-10-03 09:10:53.513980945 +0000 UTC m=+1839.866915183" watchObservedRunningTime="2025-10-03 09:10:53.525911342 +0000 UTC m=+1839.878845570" Oct 03 09:11:09 crc kubenswrapper[4790]: I1003 09:11:09.046987 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-4z5vg"] Oct 03 09:11:09 crc kubenswrapper[4790]: I1003 09:11:09.056450 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-4z5vg"] Oct 03 09:11:10 crc kubenswrapper[4790]: I1003 09:11:10.340034 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62abaf76-b94a-430c-932f-75164034006a" path="/var/lib/kubelet/pods/62abaf76-b94a-430c-932f-75164034006a/volumes" Oct 03 09:11:18 crc kubenswrapper[4790]: I1003 09:11:18.029946 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vhvvz"] Oct 03 09:11:18 crc kubenswrapper[4790]: I1003 09:11:18.037868 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-vhvvz"] Oct 03 09:11:18 crc kubenswrapper[4790]: I1003 09:11:18.340092 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc" path="/var/lib/kubelet/pods/1d1f1bdf-e9e5-4a90-b873-ae755f89c4cc/volumes" Oct 03 09:11:22 crc kubenswrapper[4790]: I1003 09:11:22.596414 4790 scope.go:117] "RemoveContainer" containerID="6cdc7e4e2579294bae2eea362c1ec158e112005ba8b149ba23fea5acc8e1c523" Oct 03 09:11:22 crc kubenswrapper[4790]: I1003 09:11:22.671621 4790 scope.go:117] "RemoveContainer" containerID="c3fb7da91946b3152ddae2cd5e951b930c2d505ffcb9a61ef1bdc5685a8c21da" Oct 03 09:11:22 crc kubenswrapper[4790]: I1003 09:11:22.736280 4790 scope.go:117] "RemoveContainer" containerID="0305656b4c1094354b0983f64912be6419b8c3b1db2abe014347fc0bc66115fb" Oct 03 09:11:48 crc kubenswrapper[4790]: I1003 09:11:48.006007 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="e3ec6c6b-f744-4d2d-a924-32f4682b1dfa" containerID="6e6e6762ad206be57fb8abeccae63d43a0f647cb9075206e8f35037e1532b390" exitCode=2 Oct 03 09:11:48 crc kubenswrapper[4790]: I1003 09:11:48.006082 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" event={"ID":"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa","Type":"ContainerDied","Data":"6e6e6762ad206be57fb8abeccae63d43a0f647cb9075206e8f35037e1532b390"} Oct 03 09:11:49 crc kubenswrapper[4790]: I1003 09:11:49.480642 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:11:49 crc kubenswrapper[4790]: I1003 09:11:49.552539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-ssh-key\") pod \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " Oct 03 09:11:49 crc kubenswrapper[4790]: I1003 09:11:49.552644 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6v9b\" (UniqueName: \"kubernetes.io/projected/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-kube-api-access-d6v9b\") pod \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " Oct 03 09:11:49 crc kubenswrapper[4790]: I1003 09:11:49.552685 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-inventory\") pod \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " Oct 03 09:11:49 crc kubenswrapper[4790]: I1003 09:11:49.560262 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-kube-api-access-d6v9b" (OuterVolumeSpecName: 
"kube-api-access-d6v9b") pod "e3ec6c6b-f744-4d2d-a924-32f4682b1dfa" (UID: "e3ec6c6b-f744-4d2d-a924-32f4682b1dfa"). InnerVolumeSpecName "kube-api-access-d6v9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:11:49 crc kubenswrapper[4790]: E1003 09:11:49.591081 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-inventory podName:e3ec6c6b-f744-4d2d-a924-32f4682b1dfa nodeName:}" failed. No retries permitted until 2025-10-03 09:11:50.091031266 +0000 UTC m=+1896.443965504 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-inventory") pod "e3ec6c6b-f744-4d2d-a924-32f4682b1dfa" (UID: "e3ec6c6b-f744-4d2d-a924-32f4682b1dfa") : error deleting /var/lib/kubelet/pods/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa/volume-subpaths: remove /var/lib/kubelet/pods/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa/volume-subpaths: no such file or directory Oct 03 09:11:49 crc kubenswrapper[4790]: I1003 09:11:49.597721 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e3ec6c6b-f744-4d2d-a924-32f4682b1dfa" (UID: "e3ec6c6b-f744-4d2d-a924-32f4682b1dfa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:11:49 crc kubenswrapper[4790]: I1003 09:11:49.655686 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6v9b\" (UniqueName: \"kubernetes.io/projected/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-kube-api-access-d6v9b\") on node \"crc\" DevicePath \"\"" Oct 03 09:11:49 crc kubenswrapper[4790]: I1003 09:11:49.655726 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:11:50 crc kubenswrapper[4790]: I1003 09:11:50.026991 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" event={"ID":"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa","Type":"ContainerDied","Data":"4d70465e448cbff4dbab221b6f492afea2ffbcb04e50bf8af8b493e8580fd552"} Oct 03 09:11:50 crc kubenswrapper[4790]: I1003 09:11:50.027031 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d70465e448cbff4dbab221b6f492afea2ffbcb04e50bf8af8b493e8580fd552" Oct 03 09:11:50 crc kubenswrapper[4790]: I1003 09:11:50.027043 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8" Oct 03 09:11:50 crc kubenswrapper[4790]: I1003 09:11:50.166724 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-inventory\") pod \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\" (UID: \"e3ec6c6b-f744-4d2d-a924-32f4682b1dfa\") " Oct 03 09:11:50 crc kubenswrapper[4790]: I1003 09:11:50.170895 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-inventory" (OuterVolumeSpecName: "inventory") pod "e3ec6c6b-f744-4d2d-a924-32f4682b1dfa" (UID: "e3ec6c6b-f744-4d2d-a924-32f4682b1dfa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:11:50 crc kubenswrapper[4790]: I1003 09:11:50.269123 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3ec6c6b-f744-4d2d-a924-32f4682b1dfa-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:11:56 crc kubenswrapper[4790]: I1003 09:11:56.044371 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-sjzpk"] Oct 03 09:11:56 crc kubenswrapper[4790]: I1003 09:11:56.054854 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-sjzpk"] Oct 03 09:11:56 crc kubenswrapper[4790]: I1003 09:11:56.340461 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e23e9a-a675-40ea-a178-37b08cd01b51" path="/var/lib/kubelet/pods/04e23e9a-a675-40ea-a178-37b08cd01b51/volumes" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.044925 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7"] Oct 03 09:11:57 crc kubenswrapper[4790]: E1003 09:11:57.045984 4790 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e3ec6c6b-f744-4d2d-a924-32f4682b1dfa" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.046011 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ec6c6b-f744-4d2d-a924-32f4682b1dfa" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.046297 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ec6c6b-f744-4d2d-a924-32f4682b1dfa" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.047163 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.050524 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.050848 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.050892 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.051520 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.062979 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7"] Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.106839 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-ssh-key\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.106926 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.107053 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xkdn\" (UniqueName: \"kubernetes.io/projected/46adefe5-7949-4e5f-a07c-81569f5f6ad8-kube-api-access-6xkdn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.209199 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xkdn\" (UniqueName: \"kubernetes.io/projected/46adefe5-7949-4e5f-a07c-81569f5f6ad8-kube-api-access-6xkdn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.209465 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc 
kubenswrapper[4790]: I1003 09:11:57.209621 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.218835 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.220193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.229047 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xkdn\" (UniqueName: \"kubernetes.io/projected/46adefe5-7949-4e5f-a07c-81569f5f6ad8-kube-api-access-6xkdn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.377473 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.964408 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7"] Oct 03 09:11:57 crc kubenswrapper[4790]: I1003 09:11:57.977011 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:11:58 crc kubenswrapper[4790]: I1003 09:11:58.102875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" event={"ID":"46adefe5-7949-4e5f-a07c-81569f5f6ad8","Type":"ContainerStarted","Data":"06ffda1e218719be497af2069ef0326d03ca0a3df4b7a9ce3a53ad3ee28a65dc"} Oct 03 09:11:59 crc kubenswrapper[4790]: I1003 09:11:59.114410 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" event={"ID":"46adefe5-7949-4e5f-a07c-81569f5f6ad8","Type":"ContainerStarted","Data":"bc3925e785737e067fd0f413eb28f3e5dbdce76fe7dc1c8c1c1a105af33e3e93"} Oct 03 09:11:59 crc kubenswrapper[4790]: I1003 09:11:59.133016 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" podStartSLOduration=1.716387289 podStartE2EDuration="2.132994306s" podCreationTimestamp="2025-10-03 09:11:57 +0000 UTC" firstStartedPulling="2025-10-03 09:11:57.976782026 +0000 UTC m=+1904.329716264" lastFinishedPulling="2025-10-03 09:11:58.393389033 +0000 UTC m=+1904.746323281" observedRunningTime="2025-10-03 09:11:59.130992701 +0000 UTC m=+1905.483926949" watchObservedRunningTime="2025-10-03 09:11:59.132994306 +0000 UTC m=+1905.485928544" Oct 03 09:12:22 crc kubenswrapper[4790]: I1003 09:12:22.889235 4790 scope.go:117] "RemoveContainer" containerID="3b4ae2d051a5407e976adcbf3502dec4289adf7ccb7e35db1e8b14cdf13c9ee4" Oct 03 09:12:42 crc 
kubenswrapper[4790]: I1003 09:12:42.533128 4790 generic.go:334] "Generic (PLEG): container finished" podID="46adefe5-7949-4e5f-a07c-81569f5f6ad8" containerID="bc3925e785737e067fd0f413eb28f3e5dbdce76fe7dc1c8c1c1a105af33e3e93" exitCode=0 Oct 03 09:12:42 crc kubenswrapper[4790]: I1003 09:12:42.533205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" event={"ID":"46adefe5-7949-4e5f-a07c-81569f5f6ad8","Type":"ContainerDied","Data":"bc3925e785737e067fd0f413eb28f3e5dbdce76fe7dc1c8c1c1a105af33e3e93"} Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.037022 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.169520 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xkdn\" (UniqueName: \"kubernetes.io/projected/46adefe5-7949-4e5f-a07c-81569f5f6ad8-kube-api-access-6xkdn\") pod \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.169685 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-ssh-key\") pod \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.169714 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-inventory\") pod \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\" (UID: \"46adefe5-7949-4e5f-a07c-81569f5f6ad8\") " Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.175205 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/46adefe5-7949-4e5f-a07c-81569f5f6ad8-kube-api-access-6xkdn" (OuterVolumeSpecName: "kube-api-access-6xkdn") pod "46adefe5-7949-4e5f-a07c-81569f5f6ad8" (UID: "46adefe5-7949-4e5f-a07c-81569f5f6ad8"). InnerVolumeSpecName "kube-api-access-6xkdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.200309 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "46adefe5-7949-4e5f-a07c-81569f5f6ad8" (UID: "46adefe5-7949-4e5f-a07c-81569f5f6ad8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.228368 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-inventory" (OuterVolumeSpecName: "inventory") pod "46adefe5-7949-4e5f-a07c-81569f5f6ad8" (UID: "46adefe5-7949-4e5f-a07c-81569f5f6ad8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.272338 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.272667 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46adefe5-7949-4e5f-a07c-81569f5f6ad8-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.272770 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xkdn\" (UniqueName: \"kubernetes.io/projected/46adefe5-7949-4e5f-a07c-81569f5f6ad8-kube-api-access-6xkdn\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.571008 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" event={"ID":"46adefe5-7949-4e5f-a07c-81569f5f6ad8","Type":"ContainerDied","Data":"06ffda1e218719be497af2069ef0326d03ca0a3df4b7a9ce3a53ad3ee28a65dc"} Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.571065 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06ffda1e218719be497af2069ef0326d03ca0a3df4b7a9ce3a53ad3ee28a65dc" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.571147 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.664347 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-brzw8"] Oct 03 09:12:44 crc kubenswrapper[4790]: E1003 09:12:44.664905 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46adefe5-7949-4e5f-a07c-81569f5f6ad8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.664925 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="46adefe5-7949-4e5f-a07c-81569f5f6ad8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.665165 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="46adefe5-7949-4e5f-a07c-81569f5f6ad8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.665959 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.669830 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.669991 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.670045 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.670208 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.683015 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-brzw8"] Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.801427 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-brzw8\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.801500 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-brzw8\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.801558 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rbqbw\" (UniqueName: \"kubernetes.io/projected/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-kube-api-access-rbqbw\") pod \"ssh-known-hosts-edpm-deployment-brzw8\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.903779 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbqbw\" (UniqueName: \"kubernetes.io/projected/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-kube-api-access-rbqbw\") pod \"ssh-known-hosts-edpm-deployment-brzw8\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.903956 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-brzw8\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.903985 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-brzw8\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.908515 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-brzw8\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 
09:12:44.908615 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-brzw8\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:44 crc kubenswrapper[4790]: I1003 09:12:44.922414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbqbw\" (UniqueName: \"kubernetes.io/projected/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-kube-api-access-rbqbw\") pod \"ssh-known-hosts-edpm-deployment-brzw8\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:45 crc kubenswrapper[4790]: I1003 09:12:44.999996 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:45 crc kubenswrapper[4790]: I1003 09:12:45.537569 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-brzw8"] Oct 03 09:12:45 crc kubenswrapper[4790]: I1003 09:12:45.579871 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" event={"ID":"44cc8b3a-08a0-490c-b1ba-e23b7da761a4","Type":"ContainerStarted","Data":"a2978f226796b1ec00fd9f8159e3c136dd336dd8c80a326c9dc75ae1368ab0a1"} Oct 03 09:12:46 crc kubenswrapper[4790]: I1003 09:12:46.592626 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" event={"ID":"44cc8b3a-08a0-490c-b1ba-e23b7da761a4","Type":"ContainerStarted","Data":"3b2b5271c45678582acbcf3e7151d4d9d0a2f51718d82766b051c6a6693d168d"} Oct 03 09:12:53 crc kubenswrapper[4790]: I1003 09:12:53.674136 4790 generic.go:334] "Generic (PLEG): container finished" podID="44cc8b3a-08a0-490c-b1ba-e23b7da761a4" 
containerID="3b2b5271c45678582acbcf3e7151d4d9d0a2f51718d82766b051c6a6693d168d" exitCode=0 Oct 03 09:12:53 crc kubenswrapper[4790]: I1003 09:12:53.674228 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" event={"ID":"44cc8b3a-08a0-490c-b1ba-e23b7da761a4","Type":"ContainerDied","Data":"3b2b5271c45678582acbcf3e7151d4d9d0a2f51718d82766b051c6a6693d168d"} Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.124828 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.229114 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-inventory-0\") pod \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.229166 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbqbw\" (UniqueName: \"kubernetes.io/projected/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-kube-api-access-rbqbw\") pod \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.229236 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-ssh-key-openstack-edpm-ipam\") pod \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\" (UID: \"44cc8b3a-08a0-490c-b1ba-e23b7da761a4\") " Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.236402 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-kube-api-access-rbqbw" (OuterVolumeSpecName: "kube-api-access-rbqbw") pod 
"44cc8b3a-08a0-490c-b1ba-e23b7da761a4" (UID: "44cc8b3a-08a0-490c-b1ba-e23b7da761a4"). InnerVolumeSpecName "kube-api-access-rbqbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.259395 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "44cc8b3a-08a0-490c-b1ba-e23b7da761a4" (UID: "44cc8b3a-08a0-490c-b1ba-e23b7da761a4"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.259744 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "44cc8b3a-08a0-490c-b1ba-e23b7da761a4" (UID: "44cc8b3a-08a0-490c-b1ba-e23b7da761a4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.331553 4790 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.331619 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbqbw\" (UniqueName: \"kubernetes.io/projected/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-kube-api-access-rbqbw\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.331636 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44cc8b3a-08a0-490c-b1ba-e23b7da761a4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.713541 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" event={"ID":"44cc8b3a-08a0-490c-b1ba-e23b7da761a4","Type":"ContainerDied","Data":"a2978f226796b1ec00fd9f8159e3c136dd336dd8c80a326c9dc75ae1368ab0a1"} Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.713610 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2978f226796b1ec00fd9f8159e3c136dd336dd8c80a326c9dc75ae1368ab0a1" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.713741 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-brzw8" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.771268 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd"] Oct 03 09:12:55 crc kubenswrapper[4790]: E1003 09:12:55.771849 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44cc8b3a-08a0-490c-b1ba-e23b7da761a4" containerName="ssh-known-hosts-edpm-deployment" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.771870 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="44cc8b3a-08a0-490c-b1ba-e23b7da761a4" containerName="ssh-known-hosts-edpm-deployment" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.772150 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="44cc8b3a-08a0-490c-b1ba-e23b7da761a4" containerName="ssh-known-hosts-edpm-deployment" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.772900 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.779213 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.779456 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.779917 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.780325 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.788978 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd"] Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.943302 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zm2qd\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.943388 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rc4\" (UniqueName: \"kubernetes.io/projected/8dd56d83-b17f-4633-822c-e9ce0e3236d3-kube-api-access-44rc4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zm2qd\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:55 crc kubenswrapper[4790]: I1003 09:12:55.943777 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zm2qd\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:56 crc kubenswrapper[4790]: I1003 09:12:56.045357 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44rc4\" (UniqueName: \"kubernetes.io/projected/8dd56d83-b17f-4633-822c-e9ce0e3236d3-kube-api-access-44rc4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zm2qd\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:56 crc kubenswrapper[4790]: I1003 09:12:56.045772 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zm2qd\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:56 crc kubenswrapper[4790]: I1003 09:12:56.045962 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zm2qd\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:56 crc kubenswrapper[4790]: I1003 09:12:56.052413 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zm2qd\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:56 crc kubenswrapper[4790]: I1003 09:12:56.054634 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zm2qd\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:56 crc kubenswrapper[4790]: I1003 09:12:56.061994 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rc4\" (UniqueName: \"kubernetes.io/projected/8dd56d83-b17f-4633-822c-e9ce0e3236d3-kube-api-access-44rc4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-zm2qd\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:56 crc kubenswrapper[4790]: I1003 09:12:56.100163 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:12:56 crc kubenswrapper[4790]: I1003 09:12:56.602421 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd"] Oct 03 09:12:56 crc kubenswrapper[4790]: I1003 09:12:56.726473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" event={"ID":"8dd56d83-b17f-4633-822c-e9ce0e3236d3","Type":"ContainerStarted","Data":"e922779293235f14be34f1faa5fa147364e35b4c7d9bb11feeff31873c78feb7"} Oct 03 09:12:57 crc kubenswrapper[4790]: I1003 09:12:57.738879 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" event={"ID":"8dd56d83-b17f-4633-822c-e9ce0e3236d3","Type":"ContainerStarted","Data":"fdad4ec6d9fb2ab3cf2eeec04eb0435513d9b1e34098acd72307fa0885372bcb"} Oct 03 09:12:57 crc kubenswrapper[4790]: I1003 09:12:57.767270 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" podStartSLOduration=2.133624452 podStartE2EDuration="2.767248195s" podCreationTimestamp="2025-10-03 09:12:55 +0000 UTC" firstStartedPulling="2025-10-03 09:12:56.603847949 +0000 UTC m=+1962.956782187" lastFinishedPulling="2025-10-03 09:12:57.237471692 +0000 UTC m=+1963.590405930" observedRunningTime="2025-10-03 09:12:57.754170456 +0000 UTC m=+1964.107104714" watchObservedRunningTime="2025-10-03 09:12:57.767248195 +0000 UTC m=+1964.120182453" Oct 03 09:13:05 crc kubenswrapper[4790]: I1003 09:13:05.816486 4790 generic.go:334] "Generic (PLEG): container finished" podID="8dd56d83-b17f-4633-822c-e9ce0e3236d3" containerID="fdad4ec6d9fb2ab3cf2eeec04eb0435513d9b1e34098acd72307fa0885372bcb" exitCode=0 Oct 03 09:13:05 crc kubenswrapper[4790]: I1003 09:13:05.816532 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" event={"ID":"8dd56d83-b17f-4633-822c-e9ce0e3236d3","Type":"ContainerDied","Data":"fdad4ec6d9fb2ab3cf2eeec04eb0435513d9b1e34098acd72307fa0885372bcb"} Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.286648 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.382382 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-ssh-key\") pod \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.382661 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-inventory\") pod \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.382721 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44rc4\" (UniqueName: \"kubernetes.io/projected/8dd56d83-b17f-4633-822c-e9ce0e3236d3-kube-api-access-44rc4\") pod \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\" (UID: \"8dd56d83-b17f-4633-822c-e9ce0e3236d3\") " Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.387923 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd56d83-b17f-4633-822c-e9ce0e3236d3-kube-api-access-44rc4" (OuterVolumeSpecName: "kube-api-access-44rc4") pod "8dd56d83-b17f-4633-822c-e9ce0e3236d3" (UID: "8dd56d83-b17f-4633-822c-e9ce0e3236d3"). InnerVolumeSpecName "kube-api-access-44rc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.408916 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-inventory" (OuterVolumeSpecName: "inventory") pod "8dd56d83-b17f-4633-822c-e9ce0e3236d3" (UID: "8dd56d83-b17f-4633-822c-e9ce0e3236d3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.415037 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8dd56d83-b17f-4633-822c-e9ce0e3236d3" (UID: "8dd56d83-b17f-4633-822c-e9ce0e3236d3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.485894 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.485930 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dd56d83-b17f-4633-822c-e9ce0e3236d3-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.485944 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44rc4\" (UniqueName: \"kubernetes.io/projected/8dd56d83-b17f-4633-822c-e9ce0e3236d3-kube-api-access-44rc4\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.845003 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" 
event={"ID":"8dd56d83-b17f-4633-822c-e9ce0e3236d3","Type":"ContainerDied","Data":"e922779293235f14be34f1faa5fa147364e35b4c7d9bb11feeff31873c78feb7"} Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.845044 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e922779293235f14be34f1faa5fa147364e35b4c7d9bb11feeff31873c78feb7" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.845094 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-zm2qd" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.921833 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2"] Oct 03 09:13:07 crc kubenswrapper[4790]: E1003 09:13:07.922257 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd56d83-b17f-4633-822c-e9ce0e3236d3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.922275 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd56d83-b17f-4633-822c-e9ce0e3236d3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.922514 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd56d83-b17f-4633-822c-e9ce0e3236d3" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.923241 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.925406 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.925708 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.925879 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.926108 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:13:07 crc kubenswrapper[4790]: I1003 09:13:07.935871 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2"] Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.097233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.097679 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2phj\" (UniqueName: \"kubernetes.io/projected/8cc4fd03-2e9a-4988-8e58-06d9df765283-kube-api-access-c2phj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.097804 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.199936 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.200015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2phj\" (UniqueName: \"kubernetes.io/projected/8cc4fd03-2e9a-4988-8e58-06d9df765283-kube-api-access-c2phj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.200048 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.210114 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.211149 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.218526 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2phj\" (UniqueName: \"kubernetes.io/projected/8cc4fd03-2e9a-4988-8e58-06d9df765283-kube-api-access-c2phj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.242071 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.802894 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2"] Oct 03 09:13:08 crc kubenswrapper[4790]: W1003 09:13:08.805098 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cc4fd03_2e9a_4988_8e58_06d9df765283.slice/crio-7ca910a68bd7b69cd2625e2f5a890497ca7908f78a43081ffa9733d40ad0cbac WatchSource:0}: Error finding container 7ca910a68bd7b69cd2625e2f5a890497ca7908f78a43081ffa9733d40ad0cbac: Status 404 returned error can't find the container with id 7ca910a68bd7b69cd2625e2f5a890497ca7908f78a43081ffa9733d40ad0cbac Oct 03 09:13:08 crc kubenswrapper[4790]: I1003 09:13:08.856602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" event={"ID":"8cc4fd03-2e9a-4988-8e58-06d9df765283","Type":"ContainerStarted","Data":"7ca910a68bd7b69cd2625e2f5a890497ca7908f78a43081ffa9733d40ad0cbac"} Oct 03 09:13:09 crc kubenswrapper[4790]: I1003 09:13:09.868183 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" event={"ID":"8cc4fd03-2e9a-4988-8e58-06d9df765283","Type":"ContainerStarted","Data":"7fe4f5fadaa060210453e7163a6716a7ce2380013f0627cb77125bcf4a566fdc"} Oct 03 09:13:09 crc kubenswrapper[4790]: I1003 09:13:09.888318 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" podStartSLOduration=2.415819533 podStartE2EDuration="2.888292054s" podCreationTimestamp="2025-10-03 09:13:07 +0000 UTC" firstStartedPulling="2025-10-03 09:13:08.807644498 +0000 UTC m=+1975.160578736" lastFinishedPulling="2025-10-03 09:13:09.280117019 +0000 UTC m=+1975.633051257" 
observedRunningTime="2025-10-03 09:13:09.887488342 +0000 UTC m=+1976.240422600" watchObservedRunningTime="2025-10-03 09:13:09.888292054 +0000 UTC m=+1976.241226292" Oct 03 09:13:10 crc kubenswrapper[4790]: I1003 09:13:10.186824 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:13:10 crc kubenswrapper[4790]: I1003 09:13:10.186894 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:13:18 crc kubenswrapper[4790]: I1003 09:13:18.960345 4790 generic.go:334] "Generic (PLEG): container finished" podID="8cc4fd03-2e9a-4988-8e58-06d9df765283" containerID="7fe4f5fadaa060210453e7163a6716a7ce2380013f0627cb77125bcf4a566fdc" exitCode=0 Oct 03 09:13:18 crc kubenswrapper[4790]: I1003 09:13:18.960378 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" event={"ID":"8cc4fd03-2e9a-4988-8e58-06d9df765283","Type":"ContainerDied","Data":"7fe4f5fadaa060210453e7163a6716a7ce2380013f0627cb77125bcf4a566fdc"} Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.372110 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.461975 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-ssh-key\") pod \"8cc4fd03-2e9a-4988-8e58-06d9df765283\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.462358 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-inventory\") pod \"8cc4fd03-2e9a-4988-8e58-06d9df765283\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.462605 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2phj\" (UniqueName: \"kubernetes.io/projected/8cc4fd03-2e9a-4988-8e58-06d9df765283-kube-api-access-c2phj\") pod \"8cc4fd03-2e9a-4988-8e58-06d9df765283\" (UID: \"8cc4fd03-2e9a-4988-8e58-06d9df765283\") " Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.475994 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc4fd03-2e9a-4988-8e58-06d9df765283-kube-api-access-c2phj" (OuterVolumeSpecName: "kube-api-access-c2phj") pod "8cc4fd03-2e9a-4988-8e58-06d9df765283" (UID: "8cc4fd03-2e9a-4988-8e58-06d9df765283"). InnerVolumeSpecName "kube-api-access-c2phj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.494654 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-inventory" (OuterVolumeSpecName: "inventory") pod "8cc4fd03-2e9a-4988-8e58-06d9df765283" (UID: "8cc4fd03-2e9a-4988-8e58-06d9df765283"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.499670 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8cc4fd03-2e9a-4988-8e58-06d9df765283" (UID: "8cc4fd03-2e9a-4988-8e58-06d9df765283"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.564854 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2phj\" (UniqueName: \"kubernetes.io/projected/8cc4fd03-2e9a-4988-8e58-06d9df765283-kube-api-access-c2phj\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.564891 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.564904 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8cc4fd03-2e9a-4988-8e58-06d9df765283-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.979916 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" event={"ID":"8cc4fd03-2e9a-4988-8e58-06d9df765283","Type":"ContainerDied","Data":"7ca910a68bd7b69cd2625e2f5a890497ca7908f78a43081ffa9733d40ad0cbac"} Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.980272 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ca910a68bd7b69cd2625e2f5a890497ca7908f78a43081ffa9733d40ad0cbac" Oct 03 09:13:20 crc kubenswrapper[4790]: I1003 09:13:20.980329 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.047569 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6"] Oct 03 09:13:21 crc kubenswrapper[4790]: E1003 09:13:21.048155 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc4fd03-2e9a-4988-8e58-06d9df765283" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.048185 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc4fd03-2e9a-4988-8e58-06d9df765283" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.048399 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cc4fd03-2e9a-4988-8e58-06d9df765283" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.049273 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.052619 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.052994 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.055993 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.056212 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.056310 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.056443 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.056548 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.056766 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.062125 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6"] Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.181937 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.181996 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182032 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182109 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182149 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182211 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgp9k\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-kube-api-access-rgp9k\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182311 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182346 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182369 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182519 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182563 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182610 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.182699 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.284682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgp9k\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-kube-api-access-rgp9k\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.284785 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.284827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.284877 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.284954 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.285003 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.285032 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.285073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.285097 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.285190 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.285241 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.285275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.285336 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.285377 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.290480 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: 
\"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.290845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.291149 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.291735 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.291915 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 
09:13:21.291946 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.292369 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.293449 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.293777 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.294544 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.295084 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.299524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.299935 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.307486 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgp9k\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-kube-api-access-rgp9k\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-x28c6\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.367777 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.896452 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6"] Oct 03 09:13:21 crc kubenswrapper[4790]: I1003 09:13:21.994077 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" event={"ID":"037bc62e-a9b6-45c4-9bce-9395a372bc90","Type":"ContainerStarted","Data":"e9e705fb9244949a03d0928737388d1f1b5d202966da4d822c7b4f38781f750f"} Oct 03 09:13:23 crc kubenswrapper[4790]: I1003 09:13:23.003784 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" event={"ID":"037bc62e-a9b6-45c4-9bce-9395a372bc90","Type":"ContainerStarted","Data":"eb08bf547bf99b04db8fbcd7cb2efd136293ed553b08b5dfd65ff2966ec5704c"} Oct 03 09:13:23 crc kubenswrapper[4790]: I1003 09:13:23.021018 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" podStartSLOduration=1.4092702209999999 podStartE2EDuration="2.021000583s" podCreationTimestamp="2025-10-03 09:13:21 +0000 UTC" firstStartedPulling="2025-10-03 09:13:21.905319947 +0000 UTC m=+1988.258254185" lastFinishedPulling="2025-10-03 09:13:22.517050309 +0000 UTC m=+1988.869984547" observedRunningTime="2025-10-03 09:13:23.020097018 +0000 UTC m=+1989.373031276" watchObservedRunningTime="2025-10-03 09:13:23.021000583 +0000 UTC m=+1989.373934821" Oct 03 09:13:40 crc kubenswrapper[4790]: I1003 09:13:40.185921 4790 
patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:13:40 crc kubenswrapper[4790]: I1003 09:13:40.186501 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:14:00 crc kubenswrapper[4790]: I1003 09:14:00.344007 4790 generic.go:334] "Generic (PLEG): container finished" podID="037bc62e-a9b6-45c4-9bce-9395a372bc90" containerID="eb08bf547bf99b04db8fbcd7cb2efd136293ed553b08b5dfd65ff2966ec5704c" exitCode=0 Oct 03 09:14:00 crc kubenswrapper[4790]: I1003 09:14:00.344092 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" event={"ID":"037bc62e-a9b6-45c4-9bce-9395a372bc90","Type":"ContainerDied","Data":"eb08bf547bf99b04db8fbcd7cb2efd136293ed553b08b5dfd65ff2966ec5704c"} Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.732647 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.838929 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-inventory\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839234 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-telemetry-combined-ca-bundle\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839300 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-libvirt-combined-ca-bundle\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839353 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839414 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-bootstrap-combined-ca-bundle\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") 
" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839471 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-ovn-default-certs-0\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839505 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgp9k\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-kube-api-access-rgp9k\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839545 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-neutron-metadata-combined-ca-bundle\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839700 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839772 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-repo-setup-combined-ca-bundle\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc 
kubenswrapper[4790]: I1003 09:14:01.839811 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ssh-key\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839850 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-nova-combined-ca-bundle\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839887 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.839941 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ovn-combined-ca-bundle\") pod \"037bc62e-a9b6-45c4-9bce-9395a372bc90\" (UID: \"037bc62e-a9b6-45c4-9bce-9395a372bc90\") " Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.845816 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.846088 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.846775 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.846908 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.848480 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.848498 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.849705 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.850478 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.850689 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.850721 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.850920 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.852629 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-kube-api-access-rgp9k" (OuterVolumeSpecName: "kube-api-access-rgp9k") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "kube-api-access-rgp9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.870894 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-inventory" (OuterVolumeSpecName: "inventory") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.873285 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "037bc62e-a9b6-45c4-9bce-9395a372bc90" (UID: "037bc62e-a9b6-45c4-9bce-9395a372bc90"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.942753 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.942791 4790 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.942809 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.942821 4790 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.942831 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 
09:14:01.942840 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.942849 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.942857 4790 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.942865 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.942876 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.942885 4790 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.943087 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-openstack-edpm-ipam-ovn-default-certs-0\") on 
node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.943098 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgp9k\" (UniqueName: \"kubernetes.io/projected/037bc62e-a9b6-45c4-9bce-9395a372bc90-kube-api-access-rgp9k\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:01 crc kubenswrapper[4790]: I1003 09:14:01.943111 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/037bc62e-a9b6-45c4-9bce-9395a372bc90-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.362669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" event={"ID":"037bc62e-a9b6-45c4-9bce-9395a372bc90","Type":"ContainerDied","Data":"e9e705fb9244949a03d0928737388d1f1b5d202966da4d822c7b4f38781f750f"} Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.362713 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9e705fb9244949a03d0928737388d1f1b5d202966da4d822c7b4f38781f750f" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.362979 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-x28c6" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.460666 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh"] Oct 03 09:14:02 crc kubenswrapper[4790]: E1003 09:14:02.461484 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="037bc62e-a9b6-45c4-9bce-9395a372bc90" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.461509 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="037bc62e-a9b6-45c4-9bce-9395a372bc90" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.461804 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="037bc62e-a9b6-45c4-9bce-9395a372bc90" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.462752 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.466432 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.466800 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.466828 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.466977 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.470752 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh"] Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.481175 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.555060 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.555118 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.555158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.555229 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4qbd\" (UniqueName: \"kubernetes.io/projected/f3195619-a867-4741-a9c5-fc995f682859-kube-api-access-b4qbd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.555269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f3195619-a867-4741-a9c5-fc995f682859-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.657870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.657975 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.658015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.658124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4qbd\" (UniqueName: \"kubernetes.io/projected/f3195619-a867-4741-a9c5-fc995f682859-kube-api-access-b4qbd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.658557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f3195619-a867-4741-a9c5-fc995f682859-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.659424 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f3195619-a867-4741-a9c5-fc995f682859-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 
03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.665408 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.666064 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.667052 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.677423 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4qbd\" (UniqueName: \"kubernetes.io/projected/f3195619-a867-4741-a9c5-fc995f682859-kube-api-access-b4qbd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c77vh\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:02 crc kubenswrapper[4790]: I1003 09:14:02.780945 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:14:03 crc kubenswrapper[4790]: I1003 09:14:03.338402 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh"] Oct 03 09:14:03 crc kubenswrapper[4790]: I1003 09:14:03.382243 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" event={"ID":"f3195619-a867-4741-a9c5-fc995f682859","Type":"ContainerStarted","Data":"32085b4379534a35861f9ada8f8181b0159a2557f0cb1cf3114f3993e16b6163"} Oct 03 09:14:04 crc kubenswrapper[4790]: I1003 09:14:04.395535 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" event={"ID":"f3195619-a867-4741-a9c5-fc995f682859","Type":"ContainerStarted","Data":"5afc1c3ae4198dc7cf15a481908e0bb2f6c11265a6e4930770233d0d84987429"} Oct 03 09:14:04 crc kubenswrapper[4790]: I1003 09:14:04.415875 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" podStartSLOduration=1.9053561829999999 podStartE2EDuration="2.415850676s" podCreationTimestamp="2025-10-03 09:14:02 +0000 UTC" firstStartedPulling="2025-10-03 09:14:03.346920033 +0000 UTC m=+2029.699854271" lastFinishedPulling="2025-10-03 09:14:03.857414516 +0000 UTC m=+2030.210348764" observedRunningTime="2025-10-03 09:14:04.412015601 +0000 UTC m=+2030.764949879" watchObservedRunningTime="2025-10-03 09:14:04.415850676 +0000 UTC m=+2030.768784914" Oct 03 09:14:10 crc kubenswrapper[4790]: I1003 09:14:10.186113 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:14:10 crc kubenswrapper[4790]: I1003 09:14:10.186760 
4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:14:10 crc kubenswrapper[4790]: I1003 09:14:10.186841 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 09:14:10 crc kubenswrapper[4790]: I1003 09:14:10.187689 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6274a3236231c094c931f1afe606d7703eff60ab0480342f673b71ab8af0913c"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:14:10 crc kubenswrapper[4790]: I1003 09:14:10.187742 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://6274a3236231c094c931f1afe606d7703eff60ab0480342f673b71ab8af0913c" gracePeriod=600 Oct 03 09:14:10 crc kubenswrapper[4790]: I1003 09:14:10.447556 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="6274a3236231c094c931f1afe606d7703eff60ab0480342f673b71ab8af0913c" exitCode=0 Oct 03 09:14:10 crc kubenswrapper[4790]: I1003 09:14:10.447621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"6274a3236231c094c931f1afe606d7703eff60ab0480342f673b71ab8af0913c"} Oct 03 09:14:10 crc kubenswrapper[4790]: 
I1003 09:14:10.447655 4790 scope.go:117] "RemoveContainer" containerID="a12f13183219a9973e13fa6740df8e860fd1b4a6d896ada977d67f910dbb6700" Oct 03 09:14:11 crc kubenswrapper[4790]: I1003 09:14:11.470290 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8"} Oct 03 09:14:34 crc kubenswrapper[4790]: I1003 09:14:34.841866 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pngw7"] Oct 03 09:14:34 crc kubenswrapper[4790]: I1003 09:14:34.845664 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:34 crc kubenswrapper[4790]: I1003 09:14:34.854835 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pngw7"] Oct 03 09:14:34 crc kubenswrapper[4790]: I1003 09:14:34.996293 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-utilities\") pod \"certified-operators-pngw7\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:34 crc kubenswrapper[4790]: I1003 09:14:34.996382 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68fgc\" (UniqueName: \"kubernetes.io/projected/fc1cfbc7-34be-4697-b694-9228816e746a-kube-api-access-68fgc\") pod \"certified-operators-pngw7\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:34 crc kubenswrapper[4790]: I1003 09:14:34.996446 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-catalog-content\") pod \"certified-operators-pngw7\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:35 crc kubenswrapper[4790]: I1003 09:14:35.098428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-utilities\") pod \"certified-operators-pngw7\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:35 crc kubenswrapper[4790]: I1003 09:14:35.098549 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68fgc\" (UniqueName: \"kubernetes.io/projected/fc1cfbc7-34be-4697-b694-9228816e746a-kube-api-access-68fgc\") pod \"certified-operators-pngw7\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:35 crc kubenswrapper[4790]: I1003 09:14:35.098648 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-catalog-content\") pod \"certified-operators-pngw7\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:35 crc kubenswrapper[4790]: I1003 09:14:35.099106 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-utilities\") pod \"certified-operators-pngw7\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:35 crc kubenswrapper[4790]: I1003 09:14:35.099164 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-catalog-content\") pod \"certified-operators-pngw7\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:35 crc kubenswrapper[4790]: I1003 09:14:35.120157 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68fgc\" (UniqueName: \"kubernetes.io/projected/fc1cfbc7-34be-4697-b694-9228816e746a-kube-api-access-68fgc\") pod \"certified-operators-pngw7\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:35 crc kubenswrapper[4790]: I1003 09:14:35.167767 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:35 crc kubenswrapper[4790]: I1003 09:14:35.728452 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pngw7"] Oct 03 09:14:35 crc kubenswrapper[4790]: I1003 09:14:35.754806 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pngw7" event={"ID":"fc1cfbc7-34be-4697-b694-9228816e746a","Type":"ContainerStarted","Data":"86e86ed722642a95747c89a81eb887d73314b57e6e76ceeb69642f0212f82f3f"} Oct 03 09:14:36 crc kubenswrapper[4790]: I1003 09:14:36.775810 4790 generic.go:334] "Generic (PLEG): container finished" podID="fc1cfbc7-34be-4697-b694-9228816e746a" containerID="c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa" exitCode=0 Oct 03 09:14:36 crc kubenswrapper[4790]: I1003 09:14:36.775874 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pngw7" event={"ID":"fc1cfbc7-34be-4697-b694-9228816e746a","Type":"ContainerDied","Data":"c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa"} Oct 03 09:14:38 crc kubenswrapper[4790]: I1003 09:14:38.799075 4790 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-pngw7" event={"ID":"fc1cfbc7-34be-4697-b694-9228816e746a","Type":"ContainerStarted","Data":"b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4"} Oct 03 09:14:39 crc kubenswrapper[4790]: I1003 09:14:39.811909 4790 generic.go:334] "Generic (PLEG): container finished" podID="fc1cfbc7-34be-4697-b694-9228816e746a" containerID="b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4" exitCode=0 Oct 03 09:14:39 crc kubenswrapper[4790]: I1003 09:14:39.811965 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pngw7" event={"ID":"fc1cfbc7-34be-4697-b694-9228816e746a","Type":"ContainerDied","Data":"b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4"} Oct 03 09:14:41 crc kubenswrapper[4790]: I1003 09:14:41.843308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pngw7" event={"ID":"fc1cfbc7-34be-4697-b694-9228816e746a","Type":"ContainerStarted","Data":"67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470"} Oct 03 09:14:41 crc kubenswrapper[4790]: I1003 09:14:41.879824 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pngw7" podStartSLOduration=4.064137625 podStartE2EDuration="7.87979296s" podCreationTimestamp="2025-10-03 09:14:34 +0000 UTC" firstStartedPulling="2025-10-03 09:14:36.778295477 +0000 UTC m=+2063.131229715" lastFinishedPulling="2025-10-03 09:14:40.593950812 +0000 UTC m=+2066.946885050" observedRunningTime="2025-10-03 09:14:41.86124192 +0000 UTC m=+2068.214176248" watchObservedRunningTime="2025-10-03 09:14:41.87979296 +0000 UTC m=+2068.232727238" Oct 03 09:14:45 crc kubenswrapper[4790]: I1003 09:14:45.168589 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:45 crc kubenswrapper[4790]: I1003 09:14:45.169258 
4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:45 crc kubenswrapper[4790]: I1003 09:14:45.218477 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:45 crc kubenswrapper[4790]: I1003 09:14:45.930665 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:45 crc kubenswrapper[4790]: I1003 09:14:45.970463 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pngw7"] Oct 03 09:14:47 crc kubenswrapper[4790]: I1003 09:14:47.885431 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-92r86"] Oct 03 09:14:47 crc kubenswrapper[4790]: I1003 09:14:47.888014 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:47 crc kubenswrapper[4790]: I1003 09:14:47.900883 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-92r86"] Oct 03 09:14:47 crc kubenswrapper[4790]: I1003 09:14:47.912499 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pngw7" podUID="fc1cfbc7-34be-4697-b694-9228816e746a" containerName="registry-server" containerID="cri-o://67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470" gracePeriod=2 Oct 03 09:14:47 crc kubenswrapper[4790]: I1003 09:14:47.977921 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c817004-6a29-4be9-907b-cc5a254e7132-utilities\") pod \"community-operators-92r86\" (UID: \"4c817004-6a29-4be9-907b-cc5a254e7132\") " pod="openshift-marketplace/community-operators-92r86" Oct 
03 09:14:47 crc kubenswrapper[4790]: I1003 09:14:47.978124 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwxtk\" (UniqueName: \"kubernetes.io/projected/4c817004-6a29-4be9-907b-cc5a254e7132-kube-api-access-dwxtk\") pod \"community-operators-92r86\" (UID: \"4c817004-6a29-4be9-907b-cc5a254e7132\") " pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:47 crc kubenswrapper[4790]: I1003 09:14:47.978191 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c817004-6a29-4be9-907b-cc5a254e7132-catalog-content\") pod \"community-operators-92r86\" (UID: \"4c817004-6a29-4be9-907b-cc5a254e7132\") " pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.080078 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c817004-6a29-4be9-907b-cc5a254e7132-utilities\") pod \"community-operators-92r86\" (UID: \"4c817004-6a29-4be9-907b-cc5a254e7132\") " pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.080163 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwxtk\" (UniqueName: \"kubernetes.io/projected/4c817004-6a29-4be9-907b-cc5a254e7132-kube-api-access-dwxtk\") pod \"community-operators-92r86\" (UID: \"4c817004-6a29-4be9-907b-cc5a254e7132\") " pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.080186 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c817004-6a29-4be9-907b-cc5a254e7132-catalog-content\") pod \"community-operators-92r86\" (UID: \"4c817004-6a29-4be9-907b-cc5a254e7132\") " 
pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.080863 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c817004-6a29-4be9-907b-cc5a254e7132-catalog-content\") pod \"community-operators-92r86\" (UID: \"4c817004-6a29-4be9-907b-cc5a254e7132\") " pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.080870 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c817004-6a29-4be9-907b-cc5a254e7132-utilities\") pod \"community-operators-92r86\" (UID: \"4c817004-6a29-4be9-907b-cc5a254e7132\") " pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.103451 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwxtk\" (UniqueName: \"kubernetes.io/projected/4c817004-6a29-4be9-907b-cc5a254e7132-kube-api-access-dwxtk\") pod \"community-operators-92r86\" (UID: \"4c817004-6a29-4be9-907b-cc5a254e7132\") " pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.251093 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.466570 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.591443 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-catalog-content\") pod \"fc1cfbc7-34be-4697-b694-9228816e746a\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.591807 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-utilities\") pod \"fc1cfbc7-34be-4697-b694-9228816e746a\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.591975 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68fgc\" (UniqueName: \"kubernetes.io/projected/fc1cfbc7-34be-4697-b694-9228816e746a-kube-api-access-68fgc\") pod \"fc1cfbc7-34be-4697-b694-9228816e746a\" (UID: \"fc1cfbc7-34be-4697-b694-9228816e746a\") " Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.592982 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-utilities" (OuterVolumeSpecName: "utilities") pod "fc1cfbc7-34be-4697-b694-9228816e746a" (UID: "fc1cfbc7-34be-4697-b694-9228816e746a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.593690 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.599113 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc1cfbc7-34be-4697-b694-9228816e746a-kube-api-access-68fgc" (OuterVolumeSpecName: "kube-api-access-68fgc") pod "fc1cfbc7-34be-4697-b694-9228816e746a" (UID: "fc1cfbc7-34be-4697-b694-9228816e746a"). InnerVolumeSpecName "kube-api-access-68fgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.639783 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc1cfbc7-34be-4697-b694-9228816e746a" (UID: "fc1cfbc7-34be-4697-b694-9228816e746a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.695014 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68fgc\" (UniqueName: \"kubernetes.io/projected/fc1cfbc7-34be-4697-b694-9228816e746a-kube-api-access-68fgc\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.695050 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc1cfbc7-34be-4697-b694-9228816e746a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.837570 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-92r86"] Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.922928 4790 generic.go:334] "Generic (PLEG): container finished" podID="fc1cfbc7-34be-4697-b694-9228816e746a" containerID="67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470" exitCode=0 Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.923007 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pngw7" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.923014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pngw7" event={"ID":"fc1cfbc7-34be-4697-b694-9228816e746a","Type":"ContainerDied","Data":"67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470"} Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.923150 4790 scope.go:117] "RemoveContainer" containerID="67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.923051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pngw7" event={"ID":"fc1cfbc7-34be-4697-b694-9228816e746a","Type":"ContainerDied","Data":"86e86ed722642a95747c89a81eb887d73314b57e6e76ceeb69642f0212f82f3f"} Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.924486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92r86" event={"ID":"4c817004-6a29-4be9-907b-cc5a254e7132","Type":"ContainerStarted","Data":"9ca000b6b799fcb430ec6863f68ab3c2dca40ea633aa8e8d9e70d0b7a80817d7"} Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.947271 4790 scope.go:117] "RemoveContainer" containerID="b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.977490 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pngw7"] Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.977961 4790 scope.go:117] "RemoveContainer" containerID="c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa" Oct 03 09:14:48 crc kubenswrapper[4790]: I1003 09:14:48.985892 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pngw7"] Oct 03 09:14:49 crc kubenswrapper[4790]: I1003 09:14:49.001792 4790 
scope.go:117] "RemoveContainer" containerID="67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470" Oct 03 09:14:49 crc kubenswrapper[4790]: E1003 09:14:49.002257 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470\": container with ID starting with 67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470 not found: ID does not exist" containerID="67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470" Oct 03 09:14:49 crc kubenswrapper[4790]: I1003 09:14:49.002290 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470"} err="failed to get container status \"67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470\": rpc error: code = NotFound desc = could not find container \"67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470\": container with ID starting with 67d7ed07f4f692ff0d60a3fc9b35c59105814d544f54af8be15f167785027470 not found: ID does not exist" Oct 03 09:14:49 crc kubenswrapper[4790]: I1003 09:14:49.002311 4790 scope.go:117] "RemoveContainer" containerID="b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4" Oct 03 09:14:49 crc kubenswrapper[4790]: E1003 09:14:49.002946 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4\": container with ID starting with b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4 not found: ID does not exist" containerID="b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4" Oct 03 09:14:49 crc kubenswrapper[4790]: I1003 09:14:49.003047 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4"} err="failed to get container status \"b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4\": rpc error: code = NotFound desc = could not find container \"b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4\": container with ID starting with b5dc5fe1820135b74da80ae58fd861e810b3530c2c89016ea12b9dc33f5acaa4 not found: ID does not exist" Oct 03 09:14:49 crc kubenswrapper[4790]: I1003 09:14:49.003126 4790 scope.go:117] "RemoveContainer" containerID="c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa" Oct 03 09:14:49 crc kubenswrapper[4790]: E1003 09:14:49.003586 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa\": container with ID starting with c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa not found: ID does not exist" containerID="c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa" Oct 03 09:14:49 crc kubenswrapper[4790]: I1003 09:14:49.003699 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa"} err="failed to get container status \"c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa\": rpc error: code = NotFound desc = could not find container \"c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa\": container with ID starting with c82911e5461ef4ae09899ed78996b1fa4828b96aa60278efff7c78cc5624d7fa not found: ID does not exist" Oct 03 09:14:49 crc kubenswrapper[4790]: I1003 09:14:49.951980 4790 generic.go:334] "Generic (PLEG): container finished" podID="4c817004-6a29-4be9-907b-cc5a254e7132" containerID="cb1679d2e7cadd3fe046e068fd90a1e8fe4958f45aea460487f3a1ac2c0299cf" exitCode=0 Oct 03 09:14:49 crc kubenswrapper[4790]: 
I1003 09:14:49.952419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92r86" event={"ID":"4c817004-6a29-4be9-907b-cc5a254e7132","Type":"ContainerDied","Data":"cb1679d2e7cadd3fe046e068fd90a1e8fe4958f45aea460487f3a1ac2c0299cf"} Oct 03 09:14:50 crc kubenswrapper[4790]: I1003 09:14:50.342034 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc1cfbc7-34be-4697-b694-9228816e746a" path="/var/lib/kubelet/pods/fc1cfbc7-34be-4697-b694-9228816e746a/volumes" Oct 03 09:14:55 crc kubenswrapper[4790]: I1003 09:14:55.014122 4790 generic.go:334] "Generic (PLEG): container finished" podID="4c817004-6a29-4be9-907b-cc5a254e7132" containerID="ce5e532ecda9d05ff8314e2112c124cd3af59632f0b45d5f4cabfdd04afe8ee8" exitCode=0 Oct 03 09:14:55 crc kubenswrapper[4790]: I1003 09:14:55.015945 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92r86" event={"ID":"4c817004-6a29-4be9-907b-cc5a254e7132","Type":"ContainerDied","Data":"ce5e532ecda9d05ff8314e2112c124cd3af59632f0b45d5f4cabfdd04afe8ee8"} Oct 03 09:14:56 crc kubenswrapper[4790]: I1003 09:14:56.029380 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92r86" event={"ID":"4c817004-6a29-4be9-907b-cc5a254e7132","Type":"ContainerStarted","Data":"78d9fa563e4135495db5d75887bfab90d06314efd2d066e54d90a92dcdf32eef"} Oct 03 09:14:56 crc kubenswrapper[4790]: I1003 09:14:56.058356 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-92r86" podStartSLOduration=3.592398442 podStartE2EDuration="9.058324998s" podCreationTimestamp="2025-10-03 09:14:47 +0000 UTC" firstStartedPulling="2025-10-03 09:14:49.955073586 +0000 UTC m=+2076.308007824" lastFinishedPulling="2025-10-03 09:14:55.421000142 +0000 UTC m=+2081.773934380" observedRunningTime="2025-10-03 09:14:56.048364264 +0000 UTC m=+2082.401298532" 
watchObservedRunningTime="2025-10-03 09:14:56.058324998 +0000 UTC m=+2082.411259296" Oct 03 09:14:58 crc kubenswrapper[4790]: I1003 09:14:58.251808 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:58 crc kubenswrapper[4790]: I1003 09:14:58.252764 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-92r86" Oct 03 09:14:58 crc kubenswrapper[4790]: I1003 09:14:58.318483 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-92r86" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.167494 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f"] Oct 03 09:15:00 crc kubenswrapper[4790]: E1003 09:15:00.168086 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1cfbc7-34be-4697-b694-9228816e746a" containerName="extract-utilities" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.168109 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1cfbc7-34be-4697-b694-9228816e746a" containerName="extract-utilities" Oct 03 09:15:00 crc kubenswrapper[4790]: E1003 09:15:00.168140 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1cfbc7-34be-4697-b694-9228816e746a" containerName="registry-server" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.168148 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1cfbc7-34be-4697-b694-9228816e746a" containerName="registry-server" Oct 03 09:15:00 crc kubenswrapper[4790]: E1003 09:15:00.168179 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc1cfbc7-34be-4697-b694-9228816e746a" containerName="extract-content" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.168188 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fc1cfbc7-34be-4697-b694-9228816e746a" containerName="extract-content" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.168508 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc1cfbc7-34be-4697-b694-9228816e746a" containerName="registry-server" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.169367 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.171949 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.171964 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.184719 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f"] Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.236703 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41d654b2-107a-4eae-b098-12182f3a239a-config-volume\") pod \"collect-profiles-29324715-f4g2f\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.236938 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znmmt\" (UniqueName: \"kubernetes.io/projected/41d654b2-107a-4eae-b098-12182f3a239a-kube-api-access-znmmt\") pod \"collect-profiles-29324715-f4g2f\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 
09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.237057 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41d654b2-107a-4eae-b098-12182f3a239a-secret-volume\") pod \"collect-profiles-29324715-f4g2f\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.339201 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znmmt\" (UniqueName: \"kubernetes.io/projected/41d654b2-107a-4eae-b098-12182f3a239a-kube-api-access-znmmt\") pod \"collect-profiles-29324715-f4g2f\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.339311 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41d654b2-107a-4eae-b098-12182f3a239a-secret-volume\") pod \"collect-profiles-29324715-f4g2f\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.339424 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41d654b2-107a-4eae-b098-12182f3a239a-config-volume\") pod \"collect-profiles-29324715-f4g2f\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.340398 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41d654b2-107a-4eae-b098-12182f3a239a-config-volume\") pod \"collect-profiles-29324715-f4g2f\" (UID: 
\"41d654b2-107a-4eae-b098-12182f3a239a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.348626 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41d654b2-107a-4eae-b098-12182f3a239a-secret-volume\") pod \"collect-profiles-29324715-f4g2f\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.356034 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znmmt\" (UniqueName: \"kubernetes.io/projected/41d654b2-107a-4eae-b098-12182f3a239a-kube-api-access-znmmt\") pod \"collect-profiles-29324715-f4g2f\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.495799 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:00 crc kubenswrapper[4790]: I1003 09:15:00.989868 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f"] Oct 03 09:15:01 crc kubenswrapper[4790]: I1003 09:15:01.097044 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" event={"ID":"41d654b2-107a-4eae-b098-12182f3a239a","Type":"ContainerStarted","Data":"bf49c906ea5b8a553f35d3908b8be6ee4a681720bb38e8bf602a1bca22b13061"} Oct 03 09:15:02 crc kubenswrapper[4790]: I1003 09:15:02.109250 4790 generic.go:334] "Generic (PLEG): container finished" podID="41d654b2-107a-4eae-b098-12182f3a239a" containerID="2ac122f766f58060ad178d2e3f4682fee79c2cae57bd85d339c08fc28532bd3a" exitCode=0 Oct 03 09:15:02 crc kubenswrapper[4790]: I1003 09:15:02.109505 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" event={"ID":"41d654b2-107a-4eae-b098-12182f3a239a","Type":"ContainerDied","Data":"2ac122f766f58060ad178d2e3f4682fee79c2cae57bd85d339c08fc28532bd3a"} Oct 03 09:15:03 crc kubenswrapper[4790]: I1003 09:15:03.461316 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:03 crc kubenswrapper[4790]: I1003 09:15:03.504173 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znmmt\" (UniqueName: \"kubernetes.io/projected/41d654b2-107a-4eae-b098-12182f3a239a-kube-api-access-znmmt\") pod \"41d654b2-107a-4eae-b098-12182f3a239a\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " Oct 03 09:15:03 crc kubenswrapper[4790]: I1003 09:15:03.504280 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41d654b2-107a-4eae-b098-12182f3a239a-secret-volume\") pod \"41d654b2-107a-4eae-b098-12182f3a239a\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " Oct 03 09:15:03 crc kubenswrapper[4790]: I1003 09:15:03.504612 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41d654b2-107a-4eae-b098-12182f3a239a-config-volume\") pod \"41d654b2-107a-4eae-b098-12182f3a239a\" (UID: \"41d654b2-107a-4eae-b098-12182f3a239a\") " Oct 03 09:15:03 crc kubenswrapper[4790]: I1003 09:15:03.505408 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d654b2-107a-4eae-b098-12182f3a239a-config-volume" (OuterVolumeSpecName: "config-volume") pod "41d654b2-107a-4eae-b098-12182f3a239a" (UID: "41d654b2-107a-4eae-b098-12182f3a239a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:15:03 crc kubenswrapper[4790]: I1003 09:15:03.510645 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d654b2-107a-4eae-b098-12182f3a239a-kube-api-access-znmmt" (OuterVolumeSpecName: "kube-api-access-znmmt") pod "41d654b2-107a-4eae-b098-12182f3a239a" (UID: "41d654b2-107a-4eae-b098-12182f3a239a"). 
InnerVolumeSpecName "kube-api-access-znmmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:03 crc kubenswrapper[4790]: I1003 09:15:03.511081 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41d654b2-107a-4eae-b098-12182f3a239a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "41d654b2-107a-4eae-b098-12182f3a239a" (UID: "41d654b2-107a-4eae-b098-12182f3a239a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:15:03 crc kubenswrapper[4790]: I1003 09:15:03.606742 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41d654b2-107a-4eae-b098-12182f3a239a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:03 crc kubenswrapper[4790]: I1003 09:15:03.606780 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41d654b2-107a-4eae-b098-12182f3a239a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:03 crc kubenswrapper[4790]: I1003 09:15:03.606793 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znmmt\" (UniqueName: \"kubernetes.io/projected/41d654b2-107a-4eae-b098-12182f3a239a-kube-api-access-znmmt\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:04 crc kubenswrapper[4790]: I1003 09:15:04.132561 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" event={"ID":"41d654b2-107a-4eae-b098-12182f3a239a","Type":"ContainerDied","Data":"bf49c906ea5b8a553f35d3908b8be6ee4a681720bb38e8bf602a1bca22b13061"} Oct 03 09:15:04 crc kubenswrapper[4790]: I1003 09:15:04.132911 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf49c906ea5b8a553f35d3908b8be6ee4a681720bb38e8bf602a1bca22b13061" Oct 03 09:15:04 crc kubenswrapper[4790]: I1003 09:15:04.132641 4790 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f" Oct 03 09:15:04 crc kubenswrapper[4790]: I1003 09:15:04.532076 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps"] Oct 03 09:15:04 crc kubenswrapper[4790]: I1003 09:15:04.540153 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324670-6j9ps"] Oct 03 09:15:06 crc kubenswrapper[4790]: I1003 09:15:06.153998 4790 generic.go:334] "Generic (PLEG): container finished" podID="f3195619-a867-4741-a9c5-fc995f682859" containerID="5afc1c3ae4198dc7cf15a481908e0bb2f6c11265a6e4930770233d0d84987429" exitCode=0 Oct 03 09:15:06 crc kubenswrapper[4790]: I1003 09:15:06.154093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" event={"ID":"f3195619-a867-4741-a9c5-fc995f682859","Type":"ContainerDied","Data":"5afc1c3ae4198dc7cf15a481908e0bb2f6c11265a6e4930770233d0d84987429"} Oct 03 09:15:06 crc kubenswrapper[4790]: I1003 09:15:06.343152 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68fadce0-3257-4525-b81f-552537c013be" path="/var/lib/kubelet/pods/68fadce0-3257-4525-b81f-552537c013be/volumes" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.564860 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.588714 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4qbd\" (UniqueName: \"kubernetes.io/projected/f3195619-a867-4741-a9c5-fc995f682859-kube-api-access-b4qbd\") pod \"f3195619-a867-4741-a9c5-fc995f682859\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.588828 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f3195619-a867-4741-a9c5-fc995f682859-ovncontroller-config-0\") pod \"f3195619-a867-4741-a9c5-fc995f682859\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.588992 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ssh-key\") pod \"f3195619-a867-4741-a9c5-fc995f682859\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.589017 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ovn-combined-ca-bundle\") pod \"f3195619-a867-4741-a9c5-fc995f682859\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.589050 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-inventory\") pod \"f3195619-a867-4741-a9c5-fc995f682859\" (UID: \"f3195619-a867-4741-a9c5-fc995f682859\") " Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.595429 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/f3195619-a867-4741-a9c5-fc995f682859-kube-api-access-b4qbd" (OuterVolumeSpecName: "kube-api-access-b4qbd") pod "f3195619-a867-4741-a9c5-fc995f682859" (UID: "f3195619-a867-4741-a9c5-fc995f682859"). InnerVolumeSpecName "kube-api-access-b4qbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.595704 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f3195619-a867-4741-a9c5-fc995f682859" (UID: "f3195619-a867-4741-a9c5-fc995f682859"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.626689 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f3195619-a867-4741-a9c5-fc995f682859" (UID: "f3195619-a867-4741-a9c5-fc995f682859"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.637518 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-inventory" (OuterVolumeSpecName: "inventory") pod "f3195619-a867-4741-a9c5-fc995f682859" (UID: "f3195619-a867-4741-a9c5-fc995f682859"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.637802 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3195619-a867-4741-a9c5-fc995f682859-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "f3195619-a867-4741-a9c5-fc995f682859" (UID: "f3195619-a867-4741-a9c5-fc995f682859"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.692740 4790 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f3195619-a867-4741-a9c5-fc995f682859-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.692904 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.693005 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.693080 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f3195619-a867-4741-a9c5-fc995f682859-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:07 crc kubenswrapper[4790]: I1003 09:15:07.693164 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4qbd\" (UniqueName: \"kubernetes.io/projected/f3195619-a867-4741-a9c5-fc995f682859-kube-api-access-b4qbd\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.178551 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" event={"ID":"f3195619-a867-4741-a9c5-fc995f682859","Type":"ContainerDied","Data":"32085b4379534a35861f9ada8f8181b0159a2557f0cb1cf3114f3993e16b6163"} Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.178921 4790 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="32085b4379534a35861f9ada8f8181b0159a2557f0cb1cf3114f3993e16b6163" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.178649 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c77vh" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.286155 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb"] Oct 03 09:15:08 crc kubenswrapper[4790]: E1003 09:15:08.286530 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3195619-a867-4741-a9c5-fc995f682859" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.286548 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3195619-a867-4741-a9c5-fc995f682859" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 09:15:08 crc kubenswrapper[4790]: E1003 09:15:08.286563 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d654b2-107a-4eae-b098-12182f3a239a" containerName="collect-profiles" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.286593 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d654b2-107a-4eae-b098-12182f3a239a" containerName="collect-profiles" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.286772 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3195619-a867-4741-a9c5-fc995f682859" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.286789 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d654b2-107a-4eae-b098-12182f3a239a" containerName="collect-profiles" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.287467 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.290123 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.293178 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.294676 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.297165 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.297479 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.297520 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.304148 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb"] Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.313135 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-92r86" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.399137 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-92r86"] Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.409690 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-ssh-key\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.409793 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.409847 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.409884 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fzln\" (UniqueName: \"kubernetes.io/projected/1e83d303-3e2f-4dab-b67f-de344d5fe175-kube-api-access-7fzln\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.409907 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.409996 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.439464 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8rbb4"] Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.439720 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8rbb4" podUID="668fc837-03c4-4539-a9c6-dc5aca43693a" containerName="registry-server" containerID="cri-o://504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706" gracePeriod=2 Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.511349 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.511416 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.511449 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fzln\" (UniqueName: \"kubernetes.io/projected/1e83d303-3e2f-4dab-b67f-de344d5fe175-kube-api-access-7fzln\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.511469 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.511549 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.511640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: 
\"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.516480 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.517032 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.520969 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.521165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.522159 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.530986 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fzln\" (UniqueName: \"kubernetes.io/projected/1e83d303-3e2f-4dab-b67f-de344d5fe175-kube-api-access-7fzln\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.616233 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:15:08 crc kubenswrapper[4790]: I1003 09:15:08.930406 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rbb4" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.027591 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn6ns\" (UniqueName: \"kubernetes.io/projected/668fc837-03c4-4539-a9c6-dc5aca43693a-kube-api-access-gn6ns\") pod \"668fc837-03c4-4539-a9c6-dc5aca43693a\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.027655 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-utilities\") pod \"668fc837-03c4-4539-a9c6-dc5aca43693a\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.027716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-catalog-content\") pod \"668fc837-03c4-4539-a9c6-dc5aca43693a\" (UID: \"668fc837-03c4-4539-a9c6-dc5aca43693a\") " Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.030210 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-utilities" (OuterVolumeSpecName: "utilities") pod "668fc837-03c4-4539-a9c6-dc5aca43693a" (UID: "668fc837-03c4-4539-a9c6-dc5aca43693a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.037553 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668fc837-03c4-4539-a9c6-dc5aca43693a-kube-api-access-gn6ns" (OuterVolumeSpecName: "kube-api-access-gn6ns") pod "668fc837-03c4-4539-a9c6-dc5aca43693a" (UID: "668fc837-03c4-4539-a9c6-dc5aca43693a"). InnerVolumeSpecName "kube-api-access-gn6ns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.102002 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "668fc837-03c4-4539-a9c6-dc5aca43693a" (UID: "668fc837-03c4-4539-a9c6-dc5aca43693a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.130754 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn6ns\" (UniqueName: \"kubernetes.io/projected/668fc837-03c4-4539-a9c6-dc5aca43693a-kube-api-access-gn6ns\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.130810 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.130828 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668fc837-03c4-4539-a9c6-dc5aca43693a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.190779 4790 generic.go:334] "Generic (PLEG): container finished" podID="668fc837-03c4-4539-a9c6-dc5aca43693a" containerID="504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706" exitCode=0 Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.190853 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rbb4" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.190876 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rbb4" event={"ID":"668fc837-03c4-4539-a9c6-dc5aca43693a","Type":"ContainerDied","Data":"504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706"} Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.190961 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rbb4" event={"ID":"668fc837-03c4-4539-a9c6-dc5aca43693a","Type":"ContainerDied","Data":"f830574ffe0327d08f29fec0d939edd213f2e01f7b7967fe12e735e9beaf50ed"} Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.190990 4790 scope.go:117] "RemoveContainer" containerID="504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.221260 4790 scope.go:117] "RemoveContainer" containerID="0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.239451 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8rbb4"] Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.248738 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8rbb4"] Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.268488 4790 scope.go:117] "RemoveContainer" containerID="71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.297065 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb"] Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.313788 4790 scope.go:117] "RemoveContainer" containerID="504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706" Oct 03 09:15:09 crc 
kubenswrapper[4790]: E1003 09:15:09.315351 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706\": container with ID starting with 504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706 not found: ID does not exist" containerID="504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.315383 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706"} err="failed to get container status \"504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706\": rpc error: code = NotFound desc = could not find container \"504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706\": container with ID starting with 504859a6c6fd684c5dc10798a342b9da957079dfa73307f1d4d4eef17a460706 not found: ID does not exist" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.315403 4790 scope.go:117] "RemoveContainer" containerID="0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a" Oct 03 09:15:09 crc kubenswrapper[4790]: E1003 09:15:09.321205 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a\": container with ID starting with 0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a not found: ID does not exist" containerID="0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.321252 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a"} err="failed to get container status 
\"0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a\": rpc error: code = NotFound desc = could not find container \"0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a\": container with ID starting with 0fbaf01ed76d7988df91296fb78b40946e27d33fd907a2bf8a44d64979915e2a not found: ID does not exist" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.321285 4790 scope.go:117] "RemoveContainer" containerID="71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0" Oct 03 09:15:09 crc kubenswrapper[4790]: E1003 09:15:09.327733 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0\": container with ID starting with 71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0 not found: ID does not exist" containerID="71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0" Oct 03 09:15:09 crc kubenswrapper[4790]: I1003 09:15:09.327783 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0"} err="failed to get container status \"71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0\": rpc error: code = NotFound desc = could not find container \"71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0\": container with ID starting with 71645f7c203aa8bf40c347599f8091ac073762c0c8d6c862706dff01ae74bcc0 not found: ID does not exist" Oct 03 09:15:10 crc kubenswrapper[4790]: I1003 09:15:10.206351 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" event={"ID":"1e83d303-3e2f-4dab-b67f-de344d5fe175","Type":"ContainerStarted","Data":"0f791e10687c07f01df14acff54837a87d446424d54243e8984cdcaa5c4d6dd2"} Oct 03 09:15:10 crc kubenswrapper[4790]: I1003 09:15:10.338946 4790 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="668fc837-03c4-4539-a9c6-dc5aca43693a" path="/var/lib/kubelet/pods/668fc837-03c4-4539-a9c6-dc5aca43693a/volumes" Oct 03 09:15:11 crc kubenswrapper[4790]: I1003 09:15:11.216227 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" event={"ID":"1e83d303-3e2f-4dab-b67f-de344d5fe175","Type":"ContainerStarted","Data":"4268e57f3696a2fc3510967040814d0a5628199e8b2381603c763065510e10bb"} Oct 03 09:15:11 crc kubenswrapper[4790]: I1003 09:15:11.236146 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" podStartSLOduration=2.136094081 podStartE2EDuration="3.236125978s" podCreationTimestamp="2025-10-03 09:15:08 +0000 UTC" firstStartedPulling="2025-10-03 09:15:09.32773727 +0000 UTC m=+2095.680671508" lastFinishedPulling="2025-10-03 09:15:10.427769157 +0000 UTC m=+2096.780703405" observedRunningTime="2025-10-03 09:15:11.231199803 +0000 UTC m=+2097.584134071" watchObservedRunningTime="2025-10-03 09:15:11.236125978 +0000 UTC m=+2097.589060206" Oct 03 09:15:23 crc kubenswrapper[4790]: I1003 09:15:23.003902 4790 scope.go:117] "RemoveContainer" containerID="464aec7f7c6fd3f9380c7c491dfee2cab43bacc0d8f586ee7aeb6b34f58b320e" Oct 03 09:15:59 crc kubenswrapper[4790]: I1003 09:15:59.736626 4790 generic.go:334] "Generic (PLEG): container finished" podID="1e83d303-3e2f-4dab-b67f-de344d5fe175" containerID="4268e57f3696a2fc3510967040814d0a5628199e8b2381603c763065510e10bb" exitCode=0 Oct 03 09:15:59 crc kubenswrapper[4790]: I1003 09:15:59.736692 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" event={"ID":"1e83d303-3e2f-4dab-b67f-de344d5fe175","Type":"ContainerDied","Data":"4268e57f3696a2fc3510967040814d0a5628199e8b2381603c763065510e10bb"} Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.201722 
4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.313091 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-ovn-metadata-agent-neutron-config-0\") pod \"1e83d303-3e2f-4dab-b67f-de344d5fe175\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.313526 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fzln\" (UniqueName: \"kubernetes.io/projected/1e83d303-3e2f-4dab-b67f-de344d5fe175-kube-api-access-7fzln\") pod \"1e83d303-3e2f-4dab-b67f-de344d5fe175\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.313656 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-metadata-combined-ca-bundle\") pod \"1e83d303-3e2f-4dab-b67f-de344d5fe175\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.313740 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-inventory\") pod \"1e83d303-3e2f-4dab-b67f-de344d5fe175\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.313833 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-ssh-key\") pod \"1e83d303-3e2f-4dab-b67f-de344d5fe175\" (UID: 
\"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.313923 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-nova-metadata-neutron-config-0\") pod \"1e83d303-3e2f-4dab-b67f-de344d5fe175\" (UID: \"1e83d303-3e2f-4dab-b67f-de344d5fe175\") " Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.319369 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e83d303-3e2f-4dab-b67f-de344d5fe175-kube-api-access-7fzln" (OuterVolumeSpecName: "kube-api-access-7fzln") pod "1e83d303-3e2f-4dab-b67f-de344d5fe175" (UID: "1e83d303-3e2f-4dab-b67f-de344d5fe175"). InnerVolumeSpecName "kube-api-access-7fzln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.319637 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1e83d303-3e2f-4dab-b67f-de344d5fe175" (UID: "1e83d303-3e2f-4dab-b67f-de344d5fe175"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.346398 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "1e83d303-3e2f-4dab-b67f-de344d5fe175" (UID: "1e83d303-3e2f-4dab-b67f-de344d5fe175"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.348751 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "1e83d303-3e2f-4dab-b67f-de344d5fe175" (UID: "1e83d303-3e2f-4dab-b67f-de344d5fe175"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.349271 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1e83d303-3e2f-4dab-b67f-de344d5fe175" (UID: "1e83d303-3e2f-4dab-b67f-de344d5fe175"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.351730 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-inventory" (OuterVolumeSpecName: "inventory") pod "1e83d303-3e2f-4dab-b67f-de344d5fe175" (UID: "1e83d303-3e2f-4dab-b67f-de344d5fe175"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.416120 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.416154 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.416172 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fzln\" (UniqueName: \"kubernetes.io/projected/1e83d303-3e2f-4dab-b67f-de344d5fe175-kube-api-access-7fzln\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.416185 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.416198 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.416211 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e83d303-3e2f-4dab-b67f-de344d5fe175-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.760872 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" 
event={"ID":"1e83d303-3e2f-4dab-b67f-de344d5fe175","Type":"ContainerDied","Data":"0f791e10687c07f01df14acff54837a87d446424d54243e8984cdcaa5c4d6dd2"} Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.760931 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f791e10687c07f01df14acff54837a87d446424d54243e8984cdcaa5c4d6dd2" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.761029 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.853261 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87"] Oct 03 09:16:01 crc kubenswrapper[4790]: E1003 09:16:01.854065 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668fc837-03c4-4539-a9c6-dc5aca43693a" containerName="registry-server" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.854088 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="668fc837-03c4-4539-a9c6-dc5aca43693a" containerName="registry-server" Oct 03 09:16:01 crc kubenswrapper[4790]: E1003 09:16:01.854138 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668fc837-03c4-4539-a9c6-dc5aca43693a" containerName="extract-content" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.854147 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="668fc837-03c4-4539-a9c6-dc5aca43693a" containerName="extract-content" Oct 03 09:16:01 crc kubenswrapper[4790]: E1003 09:16:01.854157 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668fc837-03c4-4539-a9c6-dc5aca43693a" containerName="extract-utilities" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.854166 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="668fc837-03c4-4539-a9c6-dc5aca43693a" containerName="extract-utilities" Oct 03 09:16:01 crc 
kubenswrapper[4790]: E1003 09:16:01.854194 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e83d303-3e2f-4dab-b67f-de344d5fe175" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.854204 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e83d303-3e2f-4dab-b67f-de344d5fe175" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.854646 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="668fc837-03c4-4539-a9c6-dc5aca43693a" containerName="registry-server" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.854686 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e83d303-3e2f-4dab-b67f-de344d5fe175" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.855616 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.858402 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.864898 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.864946 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.865370 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.867045 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87"] Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.867896 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.928211 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.928261 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.928351 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.928391 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:01 crc kubenswrapper[4790]: I1003 09:16:01.928549 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79wz8\" (UniqueName: \"kubernetes.io/projected/60e50039-1bd7-4bd0-98ae-f1285c701e2e-kube-api-access-79wz8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.030174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79wz8\" (UniqueName: \"kubernetes.io/projected/60e50039-1bd7-4bd0-98ae-f1285c701e2e-kube-api-access-79wz8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.030257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.030305 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.030393 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.030447 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.033863 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.035143 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.036030 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.044302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.055344 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79wz8\" (UniqueName: \"kubernetes.io/projected/60e50039-1bd7-4bd0-98ae-f1285c701e2e-kube-api-access-79wz8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-pgz87\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.177798 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.719539 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87"] Oct 03 09:16:02 crc kubenswrapper[4790]: I1003 09:16:02.775010 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" event={"ID":"60e50039-1bd7-4bd0-98ae-f1285c701e2e","Type":"ContainerStarted","Data":"a8f0517d45a2eea9bb96b8a151cfcc6dc2d046c623f5dafdc631bbd8107b133c"} Oct 03 09:16:03 crc kubenswrapper[4790]: I1003 09:16:03.786080 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" event={"ID":"60e50039-1bd7-4bd0-98ae-f1285c701e2e","Type":"ContainerStarted","Data":"9665bc83eb17c295ae6e9e17b26d3e859d12e872bf54d6e2d71eb6c8d170c822"} Oct 03 09:16:03 crc kubenswrapper[4790]: I1003 09:16:03.802343 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" podStartSLOduration=2.277164564 podStartE2EDuration="2.802323831s" podCreationTimestamp="2025-10-03 09:16:01 +0000 UTC" firstStartedPulling="2025-10-03 09:16:02.715747633 +0000 UTC m=+2149.068681871" lastFinishedPulling="2025-10-03 09:16:03.24090689 +0000 UTC m=+2149.593841138" observedRunningTime="2025-10-03 09:16:03.800286426 +0000 UTC m=+2150.153220704" watchObservedRunningTime="2025-10-03 09:16:03.802323831 +0000 UTC m=+2150.155258079" Oct 03 09:16:10 crc kubenswrapper[4790]: I1003 09:16:10.186067 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:16:10 crc kubenswrapper[4790]: I1003 
09:16:10.188141 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:16:40 crc kubenswrapper[4790]: I1003 09:16:40.186118 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:16:40 crc kubenswrapper[4790]: I1003 09:16:40.186837 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:17:10 crc kubenswrapper[4790]: I1003 09:17:10.186484 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:17:10 crc kubenswrapper[4790]: I1003 09:17:10.187044 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:17:10 crc kubenswrapper[4790]: I1003 09:17:10.187091 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 09:17:10 crc kubenswrapper[4790]: I1003 09:17:10.187697 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:17:10 crc kubenswrapper[4790]: I1003 09:17:10.187758 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" gracePeriod=600 Oct 03 09:17:10 crc kubenswrapper[4790]: E1003 09:17:10.307504 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:17:10 crc kubenswrapper[4790]: I1003 09:17:10.467306 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" exitCode=0 Oct 03 09:17:10 crc kubenswrapper[4790]: I1003 09:17:10.467350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8"} Oct 03 09:17:10 crc 
kubenswrapper[4790]: I1003 09:17:10.467388 4790 scope.go:117] "RemoveContainer" containerID="6274a3236231c094c931f1afe606d7703eff60ab0480342f673b71ab8af0913c" Oct 03 09:17:10 crc kubenswrapper[4790]: I1003 09:17:10.468099 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:17:10 crc kubenswrapper[4790]: E1003 09:17:10.468369 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:17:24 crc kubenswrapper[4790]: I1003 09:17:24.338897 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:17:24 crc kubenswrapper[4790]: E1003 09:17:24.340098 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:17:35 crc kubenswrapper[4790]: I1003 09:17:35.328281 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:17:35 crc kubenswrapper[4790]: E1003 09:17:35.329018 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.797455 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mg5db"] Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.803037 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.813919 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mg5db"] Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.858789 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmx6w\" (UniqueName: \"kubernetes.io/projected/36a8783d-f796-41ea-b847-7797630b1839-kube-api-access-rmx6w\") pod \"redhat-operators-mg5db\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.859303 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-catalog-content\") pod \"redhat-operators-mg5db\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.859493 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-utilities\") pod \"redhat-operators-mg5db\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:37 crc 
kubenswrapper[4790]: I1003 09:17:37.961124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmx6w\" (UniqueName: \"kubernetes.io/projected/36a8783d-f796-41ea-b847-7797630b1839-kube-api-access-rmx6w\") pod \"redhat-operators-mg5db\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.961265 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-catalog-content\") pod \"redhat-operators-mg5db\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.961322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-utilities\") pod \"redhat-operators-mg5db\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.961887 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-catalog-content\") pod \"redhat-operators-mg5db\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.961955 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-utilities\") pod \"redhat-operators-mg5db\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:37 crc kubenswrapper[4790]: I1003 09:17:37.983261 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmx6w\" (UniqueName: \"kubernetes.io/projected/36a8783d-f796-41ea-b847-7797630b1839-kube-api-access-rmx6w\") pod \"redhat-operators-mg5db\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:38 crc kubenswrapper[4790]: I1003 09:17:38.140879 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:38 crc kubenswrapper[4790]: I1003 09:17:38.601306 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mg5db"] Oct 03 09:17:38 crc kubenswrapper[4790]: I1003 09:17:38.756848 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg5db" event={"ID":"36a8783d-f796-41ea-b847-7797630b1839","Type":"ContainerStarted","Data":"92f4a89e2ee61ebf1c6c25e5ac776dc26c28e51f0b982b731f0b58fcbc7f8c75"} Oct 03 09:17:39 crc kubenswrapper[4790]: I1003 09:17:39.767384 4790 generic.go:334] "Generic (PLEG): container finished" podID="36a8783d-f796-41ea-b847-7797630b1839" containerID="9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd" exitCode=0 Oct 03 09:17:39 crc kubenswrapper[4790]: I1003 09:17:39.768545 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg5db" event={"ID":"36a8783d-f796-41ea-b847-7797630b1839","Type":"ContainerDied","Data":"9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd"} Oct 03 09:17:39 crc kubenswrapper[4790]: I1003 09:17:39.771903 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:17:41 crc kubenswrapper[4790]: I1003 09:17:41.813910 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg5db" 
event={"ID":"36a8783d-f796-41ea-b847-7797630b1839","Type":"ContainerStarted","Data":"0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34"} Oct 03 09:17:42 crc kubenswrapper[4790]: I1003 09:17:42.835408 4790 generic.go:334] "Generic (PLEG): container finished" podID="36a8783d-f796-41ea-b847-7797630b1839" containerID="0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34" exitCode=0 Oct 03 09:17:42 crc kubenswrapper[4790]: I1003 09:17:42.836501 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg5db" event={"ID":"36a8783d-f796-41ea-b847-7797630b1839","Type":"ContainerDied","Data":"0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34"} Oct 03 09:17:44 crc kubenswrapper[4790]: I1003 09:17:44.872015 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg5db" event={"ID":"36a8783d-f796-41ea-b847-7797630b1839","Type":"ContainerStarted","Data":"ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262"} Oct 03 09:17:44 crc kubenswrapper[4790]: I1003 09:17:44.891489 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mg5db" podStartSLOduration=4.013247288 podStartE2EDuration="7.89147088s" podCreationTimestamp="2025-10-03 09:17:37 +0000 UTC" firstStartedPulling="2025-10-03 09:17:39.771595512 +0000 UTC m=+2246.124529750" lastFinishedPulling="2025-10-03 09:17:43.649819104 +0000 UTC m=+2250.002753342" observedRunningTime="2025-10-03 09:17:44.889643269 +0000 UTC m=+2251.242577517" watchObservedRunningTime="2025-10-03 09:17:44.89147088 +0000 UTC m=+2251.244405148" Oct 03 09:17:48 crc kubenswrapper[4790]: I1003 09:17:48.142322 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:48 crc kubenswrapper[4790]: I1003 09:17:48.143738 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:49 crc kubenswrapper[4790]: I1003 09:17:49.198026 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mg5db" podUID="36a8783d-f796-41ea-b847-7797630b1839" containerName="registry-server" probeResult="failure" output=< Oct 03 09:17:49 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 09:17:49 crc kubenswrapper[4790]: > Oct 03 09:17:50 crc kubenswrapper[4790]: I1003 09:17:50.328367 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:17:50 crc kubenswrapper[4790]: E1003 09:17:50.328672 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:17:58 crc kubenswrapper[4790]: I1003 09:17:58.190538 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:58 crc kubenswrapper[4790]: I1003 09:17:58.238216 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:17:58 crc kubenswrapper[4790]: I1003 09:17:58.431360 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mg5db"] Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.018602 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mg5db" podUID="36a8783d-f796-41ea-b847-7797630b1839" containerName="registry-server" 
containerID="cri-o://ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262" gracePeriod=2 Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.532037 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.638853 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-utilities\") pod \"36a8783d-f796-41ea-b847-7797630b1839\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.638978 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmx6w\" (UniqueName: \"kubernetes.io/projected/36a8783d-f796-41ea-b847-7797630b1839-kube-api-access-rmx6w\") pod \"36a8783d-f796-41ea-b847-7797630b1839\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.639296 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-catalog-content\") pod \"36a8783d-f796-41ea-b847-7797630b1839\" (UID: \"36a8783d-f796-41ea-b847-7797630b1839\") " Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.639773 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-utilities" (OuterVolumeSpecName: "utilities") pod "36a8783d-f796-41ea-b847-7797630b1839" (UID: "36a8783d-f796-41ea-b847-7797630b1839"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.644510 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a8783d-f796-41ea-b847-7797630b1839-kube-api-access-rmx6w" (OuterVolumeSpecName: "kube-api-access-rmx6w") pod "36a8783d-f796-41ea-b847-7797630b1839" (UID: "36a8783d-f796-41ea-b847-7797630b1839"). InnerVolumeSpecName "kube-api-access-rmx6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.725639 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36a8783d-f796-41ea-b847-7797630b1839" (UID: "36a8783d-f796-41ea-b847-7797630b1839"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.741305 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.741347 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36a8783d-f796-41ea-b847-7797630b1839-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:18:00 crc kubenswrapper[4790]: I1003 09:18:00.741362 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmx6w\" (UniqueName: \"kubernetes.io/projected/36a8783d-f796-41ea-b847-7797630b1839-kube-api-access-rmx6w\") on node \"crc\" DevicePath \"\"" Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.031869 4790 generic.go:334] "Generic (PLEG): container finished" podID="36a8783d-f796-41ea-b847-7797630b1839" 
containerID="ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262" exitCode=0 Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.031943 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg5db" event={"ID":"36a8783d-f796-41ea-b847-7797630b1839","Type":"ContainerDied","Data":"ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262"} Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.031992 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mg5db" Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.032017 4790 scope.go:117] "RemoveContainer" containerID="ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262" Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.031996 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mg5db" event={"ID":"36a8783d-f796-41ea-b847-7797630b1839","Type":"ContainerDied","Data":"92f4a89e2ee61ebf1c6c25e5ac776dc26c28e51f0b982b731f0b58fcbc7f8c75"} Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.065905 4790 scope.go:117] "RemoveContainer" containerID="0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34" Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.095660 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mg5db"] Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.109390 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mg5db"] Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.111918 4790 scope.go:117] "RemoveContainer" containerID="9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd" Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.148661 4790 scope.go:117] "RemoveContainer" containerID="ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262" Oct 03 09:18:01 crc 
kubenswrapper[4790]: E1003 09:18:01.149127 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262\": container with ID starting with ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262 not found: ID does not exist" containerID="ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262" Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.149173 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262"} err="failed to get container status \"ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262\": rpc error: code = NotFound desc = could not find container \"ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262\": container with ID starting with ec883a9b612812d59b2ccb966337719bfcb84edee6687088e471db25250ee262 not found: ID does not exist" Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.149200 4790 scope.go:117] "RemoveContainer" containerID="0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34" Oct 03 09:18:01 crc kubenswrapper[4790]: E1003 09:18:01.149593 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34\": container with ID starting with 0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34 not found: ID does not exist" containerID="0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34" Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.149645 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34"} err="failed to get container status 
\"0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34\": rpc error: code = NotFound desc = could not find container \"0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34\": container with ID starting with 0eaca8ef7ec64b998ae92cf99b53dde8227ff29d106591ae3076ef10f36f2a34 not found: ID does not exist" Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.149679 4790 scope.go:117] "RemoveContainer" containerID="9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd" Oct 03 09:18:01 crc kubenswrapper[4790]: E1003 09:18:01.150081 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd\": container with ID starting with 9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd not found: ID does not exist" containerID="9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd" Oct 03 09:18:01 crc kubenswrapper[4790]: I1003 09:18:01.150221 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd"} err="failed to get container status \"9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd\": rpc error: code = NotFound desc = could not find container \"9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd\": container with ID starting with 9fce7ab7cabb5d0a4505e942ae506dd61738e84fb020f8ec4c46fcaa39c1d9bd not found: ID does not exist" Oct 03 09:18:02 crc kubenswrapper[4790]: I1003 09:18:02.343261 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a8783d-f796-41ea-b847-7797630b1839" path="/var/lib/kubelet/pods/36a8783d-f796-41ea-b847-7797630b1839/volumes" Oct 03 09:18:04 crc kubenswrapper[4790]: I1003 09:18:04.341397 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 
09:18:04 crc kubenswrapper[4790]: E1003 09:18:04.342009 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:18:15 crc kubenswrapper[4790]: I1003 09:18:15.328910 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:18:15 crc kubenswrapper[4790]: E1003 09:18:15.329860 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:18:27 crc kubenswrapper[4790]: I1003 09:18:27.328507 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:18:27 crc kubenswrapper[4790]: E1003 09:18:27.329616 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:18:42 crc kubenswrapper[4790]: I1003 09:18:42.329385 4790 scope.go:117] "RemoveContainer" 
containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:18:42 crc kubenswrapper[4790]: E1003 09:18:42.331264 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:18:57 crc kubenswrapper[4790]: I1003 09:18:57.328785 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:18:57 crc kubenswrapper[4790]: E1003 09:18:57.330451 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:19:08 crc kubenswrapper[4790]: I1003 09:19:08.329157 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:19:08 crc kubenswrapper[4790]: E1003 09:19:08.332124 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:19:22 crc kubenswrapper[4790]: I1003 09:19:22.328550 4790 scope.go:117] 
"RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:19:22 crc kubenswrapper[4790]: E1003 09:19:22.329624 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:19:34 crc kubenswrapper[4790]: I1003 09:19:34.335470 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:19:34 crc kubenswrapper[4790]: E1003 09:19:34.336164 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:19:46 crc kubenswrapper[4790]: I1003 09:19:46.328741 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:19:46 crc kubenswrapper[4790]: E1003 09:19:46.329612 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:20:01 crc kubenswrapper[4790]: I1003 09:20:01.328924 
4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:20:01 crc kubenswrapper[4790]: E1003 09:20:01.329928 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:20:14 crc kubenswrapper[4790]: I1003 09:20:14.343940 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:20:14 crc kubenswrapper[4790]: E1003 09:20:14.344871 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:20:17 crc kubenswrapper[4790]: I1003 09:20:17.414621 4790 generic.go:334] "Generic (PLEG): container finished" podID="60e50039-1bd7-4bd0-98ae-f1285c701e2e" containerID="9665bc83eb17c295ae6e9e17b26d3e859d12e872bf54d6e2d71eb6c8d170c822" exitCode=0 Oct 03 09:20:17 crc kubenswrapper[4790]: I1003 09:20:17.414692 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" event={"ID":"60e50039-1bd7-4bd0-98ae-f1285c701e2e","Type":"ContainerDied","Data":"9665bc83eb17c295ae6e9e17b26d3e859d12e872bf54d6e2d71eb6c8d170c822"} Oct 03 09:20:18 crc kubenswrapper[4790]: I1003 09:20:18.829886 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:20:18 crc kubenswrapper[4790]: I1003 09:20:18.973121 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-ssh-key\") pod \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " Oct 03 09:20:18 crc kubenswrapper[4790]: I1003 09:20:18.973310 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-secret-0\") pod \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " Oct 03 09:20:18 crc kubenswrapper[4790]: I1003 09:20:18.973365 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-combined-ca-bundle\") pod \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " Oct 03 09:20:18 crc kubenswrapper[4790]: I1003 09:20:18.973418 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-inventory\") pod \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " Oct 03 09:20:18 crc kubenswrapper[4790]: I1003 09:20:18.973483 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79wz8\" (UniqueName: \"kubernetes.io/projected/60e50039-1bd7-4bd0-98ae-f1285c701e2e-kube-api-access-79wz8\") pod \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\" (UID: \"60e50039-1bd7-4bd0-98ae-f1285c701e2e\") " Oct 03 09:20:18 crc kubenswrapper[4790]: I1003 09:20:18.979202 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "60e50039-1bd7-4bd0-98ae-f1285c701e2e" (UID: "60e50039-1bd7-4bd0-98ae-f1285c701e2e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:20:18 crc kubenswrapper[4790]: I1003 09:20:18.981808 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e50039-1bd7-4bd0-98ae-f1285c701e2e-kube-api-access-79wz8" (OuterVolumeSpecName: "kube-api-access-79wz8") pod "60e50039-1bd7-4bd0-98ae-f1285c701e2e" (UID: "60e50039-1bd7-4bd0-98ae-f1285c701e2e"). InnerVolumeSpecName "kube-api-access-79wz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.005977 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-inventory" (OuterVolumeSpecName: "inventory") pod "60e50039-1bd7-4bd0-98ae-f1285c701e2e" (UID: "60e50039-1bd7-4bd0-98ae-f1285c701e2e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.006116 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "60e50039-1bd7-4bd0-98ae-f1285c701e2e" (UID: "60e50039-1bd7-4bd0-98ae-f1285c701e2e"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.009265 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "60e50039-1bd7-4bd0-98ae-f1285c701e2e" (UID: "60e50039-1bd7-4bd0-98ae-f1285c701e2e"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.075898 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79wz8\" (UniqueName: \"kubernetes.io/projected/60e50039-1bd7-4bd0-98ae-f1285c701e2e-kube-api-access-79wz8\") on node \"crc\" DevicePath \"\"" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.075948 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.075971 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.075990 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.076009 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60e50039-1bd7-4bd0-98ae-f1285c701e2e-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.438753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" event={"ID":"60e50039-1bd7-4bd0-98ae-f1285c701e2e","Type":"ContainerDied","Data":"a8f0517d45a2eea9bb96b8a151cfcc6dc2d046c623f5dafdc631bbd8107b133c"} Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.438809 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8f0517d45a2eea9bb96b8a151cfcc6dc2d046c623f5dafdc631bbd8107b133c" Oct 03 09:20:19 
crc kubenswrapper[4790]: I1003 09:20:19.438824 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-pgz87" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.533384 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh"] Oct 03 09:20:19 crc kubenswrapper[4790]: E1003 09:20:19.534386 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e50039-1bd7-4bd0-98ae-f1285c701e2e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.534411 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e50039-1bd7-4bd0-98ae-f1285c701e2e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 09:20:19 crc kubenswrapper[4790]: E1003 09:20:19.534448 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a8783d-f796-41ea-b847-7797630b1839" containerName="registry-server" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.534457 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a8783d-f796-41ea-b847-7797630b1839" containerName="registry-server" Oct 03 09:20:19 crc kubenswrapper[4790]: E1003 09:20:19.534475 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a8783d-f796-41ea-b847-7797630b1839" containerName="extract-utilities" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.534484 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a8783d-f796-41ea-b847-7797630b1839" containerName="extract-utilities" Oct 03 09:20:19 crc kubenswrapper[4790]: E1003 09:20:19.534499 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a8783d-f796-41ea-b847-7797630b1839" containerName="extract-content" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.534507 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a8783d-f796-41ea-b847-7797630b1839" 
containerName="extract-content" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.534794 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e50039-1bd7-4bd0-98ae-f1285c701e2e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.534835 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a8783d-f796-41ea-b847-7797630b1839" containerName="registry-server" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.536398 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.538245 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.538851 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.539016 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.539061 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.539263 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.539366 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.539450 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.543048 4790 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh"] Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.694249 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.694321 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.694344 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.694496 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vmcz\" (UniqueName: \"kubernetes.io/projected/7e86fc9f-d34e-41ca-b945-b8b45136512f-kube-api-access-2vmcz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.694789 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.694867 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.694899 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.694947 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.695087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.798470 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.798613 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.798646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.798728 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vmcz\" (UniqueName: \"kubernetes.io/projected/7e86fc9f-d34e-41ca-b945-b8b45136512f-kube-api-access-2vmcz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 
crc kubenswrapper[4790]: I1003 09:20:19.798812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.798845 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.798868 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.798910 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.798990 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.800291 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.803791 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.804081 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.804369 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc 
kubenswrapper[4790]: I1003 09:20:19.804383 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.804909 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.807458 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.808191 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.821455 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vmcz\" (UniqueName: \"kubernetes.io/projected/7e86fc9f-d34e-41ca-b945-b8b45136512f-kube-api-access-2vmcz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z7tfh\" 
(UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:19 crc kubenswrapper[4790]: I1003 09:20:19.855669 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:20:20 crc kubenswrapper[4790]: I1003 09:20:20.381306 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh"] Oct 03 09:20:20 crc kubenswrapper[4790]: I1003 09:20:20.451065 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" event={"ID":"7e86fc9f-d34e-41ca-b945-b8b45136512f","Type":"ContainerStarted","Data":"a441e96a6038c6e61186d2e4221b8ff5e4dc615f3e05efbe9d3eb55aa501962a"} Oct 03 09:20:21 crc kubenswrapper[4790]: I1003 09:20:21.462370 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" event={"ID":"7e86fc9f-d34e-41ca-b945-b8b45136512f","Type":"ContainerStarted","Data":"3e58dccd9fc7aad54ee5027dffcd1d40914d00ae07b1cc05b0e8d7dc56e796d5"} Oct 03 09:20:21 crc kubenswrapper[4790]: I1003 09:20:21.496462 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" podStartSLOduration=1.901792614 podStartE2EDuration="2.496435838s" podCreationTimestamp="2025-10-03 09:20:19 +0000 UTC" firstStartedPulling="2025-10-03 09:20:20.379468287 +0000 UTC m=+2406.732402535" lastFinishedPulling="2025-10-03 09:20:20.974111521 +0000 UTC m=+2407.327045759" observedRunningTime="2025-10-03 09:20:21.48119987 +0000 UTC m=+2407.834134148" watchObservedRunningTime="2025-10-03 09:20:21.496435838 +0000 UTC m=+2407.849370116" Oct 03 09:20:28 crc kubenswrapper[4790]: I1003 09:20:28.328873 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 
09:20:28 crc kubenswrapper[4790]: E1003 09:20:28.329752 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.083903 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zv42m"] Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.086449 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.097946 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zv42m"] Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.153952 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-catalog-content\") pod \"redhat-marketplace-zv42m\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.154211 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-utilities\") pod \"redhat-marketplace-zv42m\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.154304 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-znwfn\" (UniqueName: \"kubernetes.io/projected/b6756d3b-7bb8-4454-bcae-92c1683d92be-kube-api-access-znwfn\") pod \"redhat-marketplace-zv42m\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.256444 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-catalog-content\") pod \"redhat-marketplace-zv42m\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.256599 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-utilities\") pod \"redhat-marketplace-zv42m\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.256640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znwfn\" (UniqueName: \"kubernetes.io/projected/b6756d3b-7bb8-4454-bcae-92c1683d92be-kube-api-access-znwfn\") pod \"redhat-marketplace-zv42m\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.256973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-catalog-content\") pod \"redhat-marketplace-zv42m\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.256990 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-utilities\") pod \"redhat-marketplace-zv42m\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.287402 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znwfn\" (UniqueName: \"kubernetes.io/projected/b6756d3b-7bb8-4454-bcae-92c1683d92be-kube-api-access-znwfn\") pod \"redhat-marketplace-zv42m\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.417123 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:32 crc kubenswrapper[4790]: I1003 09:20:32.865067 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zv42m"] Oct 03 09:20:33 crc kubenswrapper[4790]: I1003 09:20:33.573622 4790 generic.go:334] "Generic (PLEG): container finished" podID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerID="3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146" exitCode=0 Oct 03 09:20:33 crc kubenswrapper[4790]: I1003 09:20:33.573734 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zv42m" event={"ID":"b6756d3b-7bb8-4454-bcae-92c1683d92be","Type":"ContainerDied","Data":"3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146"} Oct 03 09:20:33 crc kubenswrapper[4790]: I1003 09:20:33.573893 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zv42m" event={"ID":"b6756d3b-7bb8-4454-bcae-92c1683d92be","Type":"ContainerStarted","Data":"ca98a9b66502ddc8c2b3e747c400c9742bec744764eac5d1374dc94c9e6fa74b"} Oct 03 09:20:34 crc kubenswrapper[4790]: I1003 09:20:34.590910 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerID="d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a" exitCode=0 Oct 03 09:20:34 crc kubenswrapper[4790]: I1003 09:20:34.591022 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zv42m" event={"ID":"b6756d3b-7bb8-4454-bcae-92c1683d92be","Type":"ContainerDied","Data":"d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a"} Oct 03 09:20:35 crc kubenswrapper[4790]: I1003 09:20:35.602755 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zv42m" event={"ID":"b6756d3b-7bb8-4454-bcae-92c1683d92be","Type":"ContainerStarted","Data":"d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982"} Oct 03 09:20:35 crc kubenswrapper[4790]: I1003 09:20:35.626788 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zv42m" podStartSLOduration=1.957086543 podStartE2EDuration="3.626771297s" podCreationTimestamp="2025-10-03 09:20:32 +0000 UTC" firstStartedPulling="2025-10-03 09:20:33.578788602 +0000 UTC m=+2419.931722840" lastFinishedPulling="2025-10-03 09:20:35.248473356 +0000 UTC m=+2421.601407594" observedRunningTime="2025-10-03 09:20:35.619240261 +0000 UTC m=+2421.972174499" watchObservedRunningTime="2025-10-03 09:20:35.626771297 +0000 UTC m=+2421.979705535" Oct 03 09:20:39 crc kubenswrapper[4790]: I1003 09:20:39.328001 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:20:39 crc kubenswrapper[4790]: E1003 09:20:39.329209 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:20:42 crc kubenswrapper[4790]: I1003 09:20:42.418110 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:42 crc kubenswrapper[4790]: I1003 09:20:42.418743 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:42 crc kubenswrapper[4790]: I1003 09:20:42.468755 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:42 crc kubenswrapper[4790]: I1003 09:20:42.729739 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:42 crc kubenswrapper[4790]: I1003 09:20:42.794664 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zv42m"] Oct 03 09:20:44 crc kubenswrapper[4790]: I1003 09:20:44.697316 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zv42m" podUID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerName="registry-server" containerID="cri-o://d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982" gracePeriod=2 Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.169212 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.253134 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-utilities\") pod \"b6756d3b-7bb8-4454-bcae-92c1683d92be\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.253216 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-catalog-content\") pod \"b6756d3b-7bb8-4454-bcae-92c1683d92be\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.253418 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znwfn\" (UniqueName: \"kubernetes.io/projected/b6756d3b-7bb8-4454-bcae-92c1683d92be-kube-api-access-znwfn\") pod \"b6756d3b-7bb8-4454-bcae-92c1683d92be\" (UID: \"b6756d3b-7bb8-4454-bcae-92c1683d92be\") " Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.254298 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-utilities" (OuterVolumeSpecName: "utilities") pod "b6756d3b-7bb8-4454-bcae-92c1683d92be" (UID: "b6756d3b-7bb8-4454-bcae-92c1683d92be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.259887 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6756d3b-7bb8-4454-bcae-92c1683d92be-kube-api-access-znwfn" (OuterVolumeSpecName: "kube-api-access-znwfn") pod "b6756d3b-7bb8-4454-bcae-92c1683d92be" (UID: "b6756d3b-7bb8-4454-bcae-92c1683d92be"). InnerVolumeSpecName "kube-api-access-znwfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.267708 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6756d3b-7bb8-4454-bcae-92c1683d92be" (UID: "b6756d3b-7bb8-4454-bcae-92c1683d92be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.356372 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.356415 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6756d3b-7bb8-4454-bcae-92c1683d92be-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.356429 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znwfn\" (UniqueName: \"kubernetes.io/projected/b6756d3b-7bb8-4454-bcae-92c1683d92be-kube-api-access-znwfn\") on node \"crc\" DevicePath \"\"" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.715035 4790 generic.go:334] "Generic (PLEG): container finished" podID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerID="d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982" exitCode=0 Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.715733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zv42m" event={"ID":"b6756d3b-7bb8-4454-bcae-92c1683d92be","Type":"ContainerDied","Data":"d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982"} Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.715790 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-zv42m" event={"ID":"b6756d3b-7bb8-4454-bcae-92c1683d92be","Type":"ContainerDied","Data":"ca98a9b66502ddc8c2b3e747c400c9742bec744764eac5d1374dc94c9e6fa74b"} Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.715823 4790 scope.go:117] "RemoveContainer" containerID="d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.716016 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zv42m" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.740336 4790 scope.go:117] "RemoveContainer" containerID="d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.779942 4790 scope.go:117] "RemoveContainer" containerID="3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.788879 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zv42m"] Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.800644 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zv42m"] Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.835280 4790 scope.go:117] "RemoveContainer" containerID="d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982" Oct 03 09:20:45 crc kubenswrapper[4790]: E1003 09:20:45.835871 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982\": container with ID starting with d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982 not found: ID does not exist" containerID="d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.835918 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982"} err="failed to get container status \"d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982\": rpc error: code = NotFound desc = could not find container \"d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982\": container with ID starting with d457403bd56c984bc17d53ec968711cd4cd504ad84f3e7ae85e169f20fe40982 not found: ID does not exist" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.835946 4790 scope.go:117] "RemoveContainer" containerID="d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a" Oct 03 09:20:45 crc kubenswrapper[4790]: E1003 09:20:45.836257 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a\": container with ID starting with d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a not found: ID does not exist" containerID="d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.836285 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a"} err="failed to get container status \"d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a\": rpc error: code = NotFound desc = could not find container \"d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a\": container with ID starting with d42ae775dd8aff1330f549b42592fafcca8a7470d15bf6840ac8dba4630d410a not found: ID does not exist" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.836305 4790 scope.go:117] "RemoveContainer" containerID="3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146" Oct 03 09:20:45 crc kubenswrapper[4790]: E1003 
09:20:45.836754 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146\": container with ID starting with 3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146 not found: ID does not exist" containerID="3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146" Oct 03 09:20:45 crc kubenswrapper[4790]: I1003 09:20:45.836793 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146"} err="failed to get container status \"3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146\": rpc error: code = NotFound desc = could not find container \"3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146\": container with ID starting with 3609586a518a150706cc494d172b0470d688067495d00ea8c0a9fc66b3e5d146 not found: ID does not exist" Oct 03 09:20:46 crc kubenswrapper[4790]: I1003 09:20:46.344853 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6756d3b-7bb8-4454-bcae-92c1683d92be" path="/var/lib/kubelet/pods/b6756d3b-7bb8-4454-bcae-92c1683d92be/volumes" Oct 03 09:20:51 crc kubenswrapper[4790]: I1003 09:20:51.328926 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:20:51 crc kubenswrapper[4790]: E1003 09:20:51.329771 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:21:05 crc kubenswrapper[4790]: I1003 09:21:05.328139 
4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:21:05 crc kubenswrapper[4790]: E1003 09:21:05.329158 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:21:19 crc kubenswrapper[4790]: I1003 09:21:19.328905 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:21:19 crc kubenswrapper[4790]: E1003 09:21:19.329684 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:21:31 crc kubenswrapper[4790]: I1003 09:21:31.329103 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:21:31 crc kubenswrapper[4790]: E1003 09:21:31.329886 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:21:45 crc kubenswrapper[4790]: I1003 
09:21:45.342946 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:21:45 crc kubenswrapper[4790]: E1003 09:21:45.346356 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:21:56 crc kubenswrapper[4790]: I1003 09:21:56.329198 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:21:56 crc kubenswrapper[4790]: E1003 09:21:56.330796 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:22:09 crc kubenswrapper[4790]: I1003 09:22:09.329847 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:22:09 crc kubenswrapper[4790]: E1003 09:22:09.331313 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:22:21 crc 
kubenswrapper[4790]: I1003 09:22:21.329442 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:22:21 crc kubenswrapper[4790]: I1003 09:22:21.746213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"19d3a3ec163b3ce35fa277de2e5c9c85ded382af52d82e819b0c00a5f4d71d5c"} Oct 03 09:24:09 crc kubenswrapper[4790]: I1003 09:24:09.851467 4790 generic.go:334] "Generic (PLEG): container finished" podID="7e86fc9f-d34e-41ca-b945-b8b45136512f" containerID="3e58dccd9fc7aad54ee5027dffcd1d40914d00ae07b1cc05b0e8d7dc56e796d5" exitCode=0 Oct 03 09:24:09 crc kubenswrapper[4790]: I1003 09:24:09.851619 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" event={"ID":"7e86fc9f-d34e-41ca-b945-b8b45136512f","Type":"ContainerDied","Data":"3e58dccd9fc7aad54ee5027dffcd1d40914d00ae07b1cc05b0e8d7dc56e796d5"} Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.300757 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.407315 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-0\") pod \"7e86fc9f-d34e-41ca-b945-b8b45136512f\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.407417 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-inventory\") pod \"7e86fc9f-d34e-41ca-b945-b8b45136512f\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.407493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-extra-config-0\") pod \"7e86fc9f-d34e-41ca-b945-b8b45136512f\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.407527 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-ssh-key\") pod \"7e86fc9f-d34e-41ca-b945-b8b45136512f\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.407594 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-0\") pod \"7e86fc9f-d34e-41ca-b945-b8b45136512f\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.407629 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2vmcz\" (UniqueName: \"kubernetes.io/projected/7e86fc9f-d34e-41ca-b945-b8b45136512f-kube-api-access-2vmcz\") pod \"7e86fc9f-d34e-41ca-b945-b8b45136512f\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.407665 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-combined-ca-bundle\") pod \"7e86fc9f-d34e-41ca-b945-b8b45136512f\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.407732 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-1\") pod \"7e86fc9f-d34e-41ca-b945-b8b45136512f\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.407762 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-1\") pod \"7e86fc9f-d34e-41ca-b945-b8b45136512f\" (UID: \"7e86fc9f-d34e-41ca-b945-b8b45136512f\") " Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.414631 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e86fc9f-d34e-41ca-b945-b8b45136512f-kube-api-access-2vmcz" (OuterVolumeSpecName: "kube-api-access-2vmcz") pod "7e86fc9f-d34e-41ca-b945-b8b45136512f" (UID: "7e86fc9f-d34e-41ca-b945-b8b45136512f"). InnerVolumeSpecName "kube-api-access-2vmcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.454939 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7e86fc9f-d34e-41ca-b945-b8b45136512f" (UID: "7e86fc9f-d34e-41ca-b945-b8b45136512f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.455866 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7e86fc9f-d34e-41ca-b945-b8b45136512f" (UID: "7e86fc9f-d34e-41ca-b945-b8b45136512f"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.458559 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7e86fc9f-d34e-41ca-b945-b8b45136512f" (UID: "7e86fc9f-d34e-41ca-b945-b8b45136512f"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.459213 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-inventory" (OuterVolumeSpecName: "inventory") pod "7e86fc9f-d34e-41ca-b945-b8b45136512f" (UID: "7e86fc9f-d34e-41ca-b945-b8b45136512f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.472468 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7e86fc9f-d34e-41ca-b945-b8b45136512f" (UID: "7e86fc9f-d34e-41ca-b945-b8b45136512f"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.478959 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7e86fc9f-d34e-41ca-b945-b8b45136512f" (UID: "7e86fc9f-d34e-41ca-b945-b8b45136512f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.486420 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "7e86fc9f-d34e-41ca-b945-b8b45136512f" (UID: "7e86fc9f-d34e-41ca-b945-b8b45136512f"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.489110 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7e86fc9f-d34e-41ca-b945-b8b45136512f" (UID: "7e86fc9f-d34e-41ca-b945-b8b45136512f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.510092 4790 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.510119 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.510130 4790 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.510138 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.510148 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.510156 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vmcz\" (UniqueName: \"kubernetes.io/projected/7e86fc9f-d34e-41ca-b945-b8b45136512f-kube-api-access-2vmcz\") on node \"crc\" DevicePath \"\"" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.510164 4790 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:24:11 crc 
kubenswrapper[4790]: I1003 09:24:11.510172 4790 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.510181 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7e86fc9f-d34e-41ca-b945-b8b45136512f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.875374 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" event={"ID":"7e86fc9f-d34e-41ca-b945-b8b45136512f","Type":"ContainerDied","Data":"a441e96a6038c6e61186d2e4221b8ff5e4dc615f3e05efbe9d3eb55aa501962a"} Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.875932 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a441e96a6038c6e61186d2e4221b8ff5e4dc615f3e05efbe9d3eb55aa501962a" Oct 03 09:24:11 crc kubenswrapper[4790]: I1003 09:24:11.875489 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z7tfh" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.004866 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w"] Oct 03 09:24:12 crc kubenswrapper[4790]: E1003 09:24:12.005328 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerName="extract-content" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.005354 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerName="extract-content" Oct 03 09:24:12 crc kubenswrapper[4790]: E1003 09:24:12.005373 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerName="registry-server" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.005382 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerName="registry-server" Oct 03 09:24:12 crc kubenswrapper[4790]: E1003 09:24:12.005409 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e86fc9f-d34e-41ca-b945-b8b45136512f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.005419 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e86fc9f-d34e-41ca-b945-b8b45136512f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 09:24:12 crc kubenswrapper[4790]: E1003 09:24:12.005448 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerName="extract-utilities" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.005456 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerName="extract-utilities" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.005683 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7e86fc9f-d34e-41ca-b945-b8b45136512f" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.005694 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6756d3b-7bb8-4454-bcae-92c1683d92be" containerName="registry-server" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.006894 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.009237 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-8j46v" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.009414 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.011629 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.011871 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.012064 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.020533 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w"] Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.032454 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-665hx\" (UniqueName: \"kubernetes.io/projected/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-kube-api-access-665hx\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.032554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.032789 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.032893 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.032994 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: 
\"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.033029 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.033142 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.134269 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.134339 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc 
kubenswrapper[4790]: I1003 09:24:12.134383 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.134402 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.134449 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.134477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-665hx\" (UniqueName: \"kubernetes.io/projected/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-kube-api-access-665hx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.134504 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.137947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.137997 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.138738 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.139024 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.139721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.142213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.151185 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-665hx\" (UniqueName: \"kubernetes.io/projected/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-kube-api-access-665hx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-48d4w\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.338712 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.866889 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.867806 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w"] Oct 03 09:24:12 crc kubenswrapper[4790]: I1003 09:24:12.886365 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" event={"ID":"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6","Type":"ContainerStarted","Data":"d0aa859a38dbcc3167943e2a8ce883fad03e566c32a9245ced37d5a0d54dc36f"} Oct 03 09:24:13 crc kubenswrapper[4790]: I1003 09:24:13.899280 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" event={"ID":"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6","Type":"ContainerStarted","Data":"fee00ef21159bb1b571160f9e0c1ee2ce0d144c955b69e12b7cc9351e71653c2"} Oct 03 09:24:13 crc kubenswrapper[4790]: I1003 09:24:13.922671 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" podStartSLOduration=2.496137212 podStartE2EDuration="2.922648642s" podCreationTimestamp="2025-10-03 09:24:11 +0000 UTC" firstStartedPulling="2025-10-03 09:24:12.866622023 +0000 UTC m=+2639.219556281" lastFinishedPulling="2025-10-03 09:24:13.293133473 +0000 UTC m=+2639.646067711" observedRunningTime="2025-10-03 09:24:13.919253968 +0000 UTC m=+2640.272188246" watchObservedRunningTime="2025-10-03 09:24:13.922648642 +0000 UTC m=+2640.275582880" Oct 03 09:24:40 crc kubenswrapper[4790]: I1003 09:24:40.186200 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:24:40 crc kubenswrapper[4790]: I1003 09:24:40.186999 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:25:10 crc kubenswrapper[4790]: I1003 09:25:10.185724 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:25:10 crc kubenswrapper[4790]: I1003 09:25:10.186415 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.631289 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ggcfg"] Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.635023 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.643459 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggcfg"] Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.710756 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-catalog-content\") pod \"community-operators-ggcfg\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.711108 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-utilities\") pod \"community-operators-ggcfg\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.711148 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4qdz\" (UniqueName: \"kubernetes.io/projected/5d5253a8-eac9-41f0-8770-642f1bee807c-kube-api-access-g4qdz\") pod \"community-operators-ggcfg\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.812719 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-catalog-content\") pod \"community-operators-ggcfg\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.813182 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-utilities\") pod \"community-operators-ggcfg\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.813265 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-catalog-content\") pod \"community-operators-ggcfg\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.813397 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4qdz\" (UniqueName: \"kubernetes.io/projected/5d5253a8-eac9-41f0-8770-642f1bee807c-kube-api-access-g4qdz\") pod \"community-operators-ggcfg\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.813671 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-utilities\") pod \"community-operators-ggcfg\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.825351 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c8z87"] Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.827999 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.833002 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4qdz\" (UniqueName: \"kubernetes.io/projected/5d5253a8-eac9-41f0-8770-642f1bee807c-kube-api-access-g4qdz\") pod \"community-operators-ggcfg\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.837340 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8z87"] Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.914635 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5bgl\" (UniqueName: \"kubernetes.io/projected/db5acd04-b20a-4f50-b8a6-6813961ec49f-kube-api-access-s5bgl\") pod \"certified-operators-c8z87\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.914958 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-utilities\") pod \"certified-operators-c8z87\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.915087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-catalog-content\") pod \"certified-operators-c8z87\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:25 crc kubenswrapper[4790]: I1003 09:25:25.975759 4790 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.017261 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-utilities\") pod \"certified-operators-c8z87\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.017371 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-catalog-content\") pod \"certified-operators-c8z87\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.017421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5bgl\" (UniqueName: \"kubernetes.io/projected/db5acd04-b20a-4f50-b8a6-6813961ec49f-kube-api-access-s5bgl\") pod \"certified-operators-c8z87\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.018078 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-catalog-content\") pod \"certified-operators-c8z87\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.018377 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-utilities\") pod \"certified-operators-c8z87\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " 
pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.034434 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5bgl\" (UniqueName: \"kubernetes.io/projected/db5acd04-b20a-4f50-b8a6-6813961ec49f-kube-api-access-s5bgl\") pod \"certified-operators-c8z87\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.192206 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.546145 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggcfg"] Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.773042 4790 generic.go:334] "Generic (PLEG): container finished" podID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerID="364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493" exitCode=0 Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.773118 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggcfg" event={"ID":"5d5253a8-eac9-41f0-8770-642f1bee807c","Type":"ContainerDied","Data":"364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493"} Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.773458 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggcfg" event={"ID":"5d5253a8-eac9-41f0-8770-642f1bee807c","Type":"ContainerStarted","Data":"c8e222648856cd33b29d8c874c5830b4f912a835d100e81fb4f61904d060734c"} Oct 03 09:25:26 crc kubenswrapper[4790]: I1003 09:25:26.778457 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8z87"] Oct 03 09:25:27 crc kubenswrapper[4790]: I1003 09:25:27.783882 4790 generic.go:334] "Generic (PLEG): 
container finished" podID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerID="e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046" exitCode=0 Oct 03 09:25:27 crc kubenswrapper[4790]: I1003 09:25:27.784011 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8z87" event={"ID":"db5acd04-b20a-4f50-b8a6-6813961ec49f","Type":"ContainerDied","Data":"e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046"} Oct 03 09:25:27 crc kubenswrapper[4790]: I1003 09:25:27.784235 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8z87" event={"ID":"db5acd04-b20a-4f50-b8a6-6813961ec49f","Type":"ContainerStarted","Data":"6801ca83e36a97f0a02f79ab1c7822ee72cce20eeb0f307258fccccd1e074421"} Oct 03 09:25:28 crc kubenswrapper[4790]: I1003 09:25:28.795947 4790 generic.go:334] "Generic (PLEG): container finished" podID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerID="27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2" exitCode=0 Oct 03 09:25:28 crc kubenswrapper[4790]: I1003 09:25:28.795991 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggcfg" event={"ID":"5d5253a8-eac9-41f0-8770-642f1bee807c","Type":"ContainerDied","Data":"27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2"} Oct 03 09:25:28 crc kubenswrapper[4790]: I1003 09:25:28.799383 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8z87" event={"ID":"db5acd04-b20a-4f50-b8a6-6813961ec49f","Type":"ContainerStarted","Data":"c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0"} Oct 03 09:25:29 crc kubenswrapper[4790]: I1003 09:25:29.840322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggcfg" 
event={"ID":"5d5253a8-eac9-41f0-8770-642f1bee807c","Type":"ContainerStarted","Data":"be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3"} Oct 03 09:25:29 crc kubenswrapper[4790]: I1003 09:25:29.860071 4790 generic.go:334] "Generic (PLEG): container finished" podID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerID="c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0" exitCode=0 Oct 03 09:25:29 crc kubenswrapper[4790]: I1003 09:25:29.860210 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8z87" event={"ID":"db5acd04-b20a-4f50-b8a6-6813961ec49f","Type":"ContainerDied","Data":"c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0"} Oct 03 09:25:29 crc kubenswrapper[4790]: I1003 09:25:29.875729 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ggcfg" podStartSLOduration=2.453297026 podStartE2EDuration="4.875715924s" podCreationTimestamp="2025-10-03 09:25:25 +0000 UTC" firstStartedPulling="2025-10-03 09:25:26.774866913 +0000 UTC m=+2713.127801151" lastFinishedPulling="2025-10-03 09:25:29.197285801 +0000 UTC m=+2715.550220049" observedRunningTime="2025-10-03 09:25:29.874714747 +0000 UTC m=+2716.227648995" watchObservedRunningTime="2025-10-03 09:25:29.875715924 +0000 UTC m=+2716.228650162" Oct 03 09:25:30 crc kubenswrapper[4790]: I1003 09:25:30.874701 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8z87" event={"ID":"db5acd04-b20a-4f50-b8a6-6813961ec49f","Type":"ContainerStarted","Data":"b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b"} Oct 03 09:25:30 crc kubenswrapper[4790]: I1003 09:25:30.911533 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c8z87" podStartSLOduration=3.099284768 podStartE2EDuration="5.911514867s" podCreationTimestamp="2025-10-03 09:25:25 +0000 UTC" 
firstStartedPulling="2025-10-03 09:25:27.785968258 +0000 UTC m=+2714.138902506" lastFinishedPulling="2025-10-03 09:25:30.598198347 +0000 UTC m=+2716.951132605" observedRunningTime="2025-10-03 09:25:30.90687911 +0000 UTC m=+2717.259813358" watchObservedRunningTime="2025-10-03 09:25:30.911514867 +0000 UTC m=+2717.264449115" Oct 03 09:25:35 crc kubenswrapper[4790]: I1003 09:25:35.976675 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:35 crc kubenswrapper[4790]: I1003 09:25:35.977293 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:36 crc kubenswrapper[4790]: I1003 09:25:36.035310 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:36 crc kubenswrapper[4790]: I1003 09:25:36.193603 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:36 crc kubenswrapper[4790]: I1003 09:25:36.193652 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:36 crc kubenswrapper[4790]: I1003 09:25:36.259859 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:37 crc kubenswrapper[4790]: I1003 09:25:37.016243 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:37 crc kubenswrapper[4790]: I1003 09:25:37.019044 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:38 crc kubenswrapper[4790]: I1003 09:25:38.823060 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-c8z87"] Oct 03 09:25:38 crc kubenswrapper[4790]: I1003 09:25:38.963245 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c8z87" podUID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerName="registry-server" containerID="cri-o://b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b" gracePeriod=2 Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.425595 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ggcfg"] Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.426083 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ggcfg" podUID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerName="registry-server" containerID="cri-o://be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3" gracePeriod=2 Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.450931 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.500092 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-utilities\") pod \"db5acd04-b20a-4f50-b8a6-6813961ec49f\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.500146 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5bgl\" (UniqueName: \"kubernetes.io/projected/db5acd04-b20a-4f50-b8a6-6813961ec49f-kube-api-access-s5bgl\") pod \"db5acd04-b20a-4f50-b8a6-6813961ec49f\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.500398 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-catalog-content\") pod \"db5acd04-b20a-4f50-b8a6-6813961ec49f\" (UID: \"db5acd04-b20a-4f50-b8a6-6813961ec49f\") " Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.501510 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-utilities" (OuterVolumeSpecName: "utilities") pod "db5acd04-b20a-4f50-b8a6-6813961ec49f" (UID: "db5acd04-b20a-4f50-b8a6-6813961ec49f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.508321 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db5acd04-b20a-4f50-b8a6-6813961ec49f-kube-api-access-s5bgl" (OuterVolumeSpecName: "kube-api-access-s5bgl") pod "db5acd04-b20a-4f50-b8a6-6813961ec49f" (UID: "db5acd04-b20a-4f50-b8a6-6813961ec49f"). InnerVolumeSpecName "kube-api-access-s5bgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.560249 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db5acd04-b20a-4f50-b8a6-6813961ec49f" (UID: "db5acd04-b20a-4f50-b8a6-6813961ec49f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.602262 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5bgl\" (UniqueName: \"kubernetes.io/projected/db5acd04-b20a-4f50-b8a6-6813961ec49f-kube-api-access-s5bgl\") on node \"crc\" DevicePath \"\"" Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.602295 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.602308 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db5acd04-b20a-4f50-b8a6-6813961ec49f-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.937187 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.975862 4790 generic.go:334] "Generic (PLEG): container finished" podID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerID="be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3" exitCode=0 Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.975946 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggcfg" Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.975906 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggcfg" event={"ID":"5d5253a8-eac9-41f0-8770-642f1bee807c","Type":"ContainerDied","Data":"be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3"} Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.976312 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggcfg" event={"ID":"5d5253a8-eac9-41f0-8770-642f1bee807c","Type":"ContainerDied","Data":"c8e222648856cd33b29d8c874c5830b4f912a835d100e81fb4f61904d060734c"} Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.976371 4790 scope.go:117] "RemoveContainer" containerID="be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3" Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.979061 4790 generic.go:334] "Generic (PLEG): container finished" podID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerID="b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b" exitCode=0 Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.979093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8z87" event={"ID":"db5acd04-b20a-4f50-b8a6-6813961ec49f","Type":"ContainerDied","Data":"b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b"} Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.979116 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8z87" event={"ID":"db5acd04-b20a-4f50-b8a6-6813961ec49f","Type":"ContainerDied","Data":"6801ca83e36a97f0a02f79ab1c7822ee72cce20eeb0f307258fccccd1e074421"} Oct 03 09:25:39 crc kubenswrapper[4790]: I1003 09:25:39.979236 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c8z87" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.007330 4790 scope.go:117] "RemoveContainer" containerID="27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.010236 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-catalog-content\") pod \"5d5253a8-eac9-41f0-8770-642f1bee807c\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.010461 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4qdz\" (UniqueName: \"kubernetes.io/projected/5d5253a8-eac9-41f0-8770-642f1bee807c-kube-api-access-g4qdz\") pod \"5d5253a8-eac9-41f0-8770-642f1bee807c\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.010743 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-utilities\") pod \"5d5253a8-eac9-41f0-8770-642f1bee807c\" (UID: \"5d5253a8-eac9-41f0-8770-642f1bee807c\") " Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.012121 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-utilities" (OuterVolumeSpecName: "utilities") pod "5d5253a8-eac9-41f0-8770-642f1bee807c" (UID: "5d5253a8-eac9-41f0-8770-642f1bee807c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.021948 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c8z87"] Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.033595 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c8z87"] Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.033711 4790 scope.go:117] "RemoveContainer" containerID="364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.033838 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5253a8-eac9-41f0-8770-642f1bee807c-kube-api-access-g4qdz" (OuterVolumeSpecName: "kube-api-access-g4qdz") pod "5d5253a8-eac9-41f0-8770-642f1bee807c" (UID: "5d5253a8-eac9-41f0-8770-642f1bee807c"). InnerVolumeSpecName "kube-api-access-g4qdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.057337 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d5253a8-eac9-41f0-8770-642f1bee807c" (UID: "5d5253a8-eac9-41f0-8770-642f1bee807c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.059000 4790 scope.go:117] "RemoveContainer" containerID="be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3" Oct 03 09:25:40 crc kubenswrapper[4790]: E1003 09:25:40.059508 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3\": container with ID starting with be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3 not found: ID does not exist" containerID="be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.059696 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3"} err="failed to get container status \"be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3\": rpc error: code = NotFound desc = could not find container \"be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3\": container with ID starting with be7f65090a4f95c5c9b677d89fabae916182fccf05260fda52f0a5cf69118fa3 not found: ID does not exist" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.059736 4790 scope.go:117] "RemoveContainer" containerID="27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2" Oct 03 09:25:40 crc kubenswrapper[4790]: E1003 09:25:40.060139 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2\": container with ID starting with 27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2 not found: ID does not exist" containerID="27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.060228 
4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2"} err="failed to get container status \"27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2\": rpc error: code = NotFound desc = could not find container \"27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2\": container with ID starting with 27570409cc2020e007553f7d4bf8d695d4b7830655873066bce921951623e0b2 not found: ID does not exist" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.060305 4790 scope.go:117] "RemoveContainer" containerID="364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493" Oct 03 09:25:40 crc kubenswrapper[4790]: E1003 09:25:40.060676 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493\": container with ID starting with 364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493 not found: ID does not exist" containerID="364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.060713 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493"} err="failed to get container status \"364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493\": rpc error: code = NotFound desc = could not find container \"364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493\": container with ID starting with 364357a7a673457ebb706570f2ebecc41022092affa4cda4e385b8444d9f6493 not found: ID does not exist" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.060734 4790 scope.go:117] "RemoveContainer" containerID="b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 
09:25:40.078294 4790 scope.go:117] "RemoveContainer" containerID="c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.098590 4790 scope.go:117] "RemoveContainer" containerID="e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.114718 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.114764 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d5253a8-eac9-41f0-8770-642f1bee807c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.114790 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4qdz\" (UniqueName: \"kubernetes.io/projected/5d5253a8-eac9-41f0-8770-642f1bee807c-kube-api-access-g4qdz\") on node \"crc\" DevicePath \"\"" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.187661 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.187749 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.187801 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.188676 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19d3a3ec163b3ce35fa277de2e5c9c85ded382af52d82e819b0c00a5f4d71d5c"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.188748 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://19d3a3ec163b3ce35fa277de2e5c9c85ded382af52d82e819b0c00a5f4d71d5c" gracePeriod=600 Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.200035 4790 scope.go:117] "RemoveContainer" containerID="b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b" Oct 03 09:25:40 crc kubenswrapper[4790]: E1003 09:25:40.202163 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b\": container with ID starting with b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b not found: ID does not exist" containerID="b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.202211 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b"} err="failed to get container status \"b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b\": rpc error: code = NotFound desc = could not find container 
\"b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b\": container with ID starting with b7c1ca79b10422d0870729bbbd68ea296add4625e63a6675e23abd9f668df02b not found: ID does not exist" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.202245 4790 scope.go:117] "RemoveContainer" containerID="c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0" Oct 03 09:25:40 crc kubenswrapper[4790]: E1003 09:25:40.205789 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0\": container with ID starting with c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0 not found: ID does not exist" containerID="c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.205834 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0"} err="failed to get container status \"c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0\": rpc error: code = NotFound desc = could not find container \"c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0\": container with ID starting with c90678431bd12dac0717b7327f2a141dbc5109220f68b98486bf47abfd263ca0 not found: ID does not exist" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.205863 4790 scope.go:117] "RemoveContainer" containerID="e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046" Oct 03 09:25:40 crc kubenswrapper[4790]: E1003 09:25:40.212956 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046\": container with ID starting with e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046 not found: ID does not exist" 
containerID="e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.213012 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046"} err="failed to get container status \"e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046\": rpc error: code = NotFound desc = could not find container \"e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046\": container with ID starting with e8263c4ca0f33c72ee130a3416b33cba6eff877ada96c943dd3221f03fcf9046 not found: ID does not exist" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.327262 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ggcfg"] Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.348093 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db5acd04-b20a-4f50-b8a6-6813961ec49f" path="/var/lib/kubelet/pods/db5acd04-b20a-4f50-b8a6-6813961ec49f/volumes" Oct 03 09:25:40 crc kubenswrapper[4790]: I1003 09:25:40.349336 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ggcfg"] Oct 03 09:25:41 crc kubenswrapper[4790]: I1003 09:25:41.005179 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="19d3a3ec163b3ce35fa277de2e5c9c85ded382af52d82e819b0c00a5f4d71d5c" exitCode=0 Oct 03 09:25:41 crc kubenswrapper[4790]: I1003 09:25:41.005247 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"19d3a3ec163b3ce35fa277de2e5c9c85ded382af52d82e819b0c00a5f4d71d5c"} Oct 03 09:25:41 crc kubenswrapper[4790]: I1003 09:25:41.006439 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b"} Oct 03 09:25:41 crc kubenswrapper[4790]: I1003 09:25:41.006517 4790 scope.go:117] "RemoveContainer" containerID="105f7e31a1b4fd122ba3755f58ec2001671f4c1e80e4803ea6d27a8c94b0c9b8" Oct 03 09:25:42 crc kubenswrapper[4790]: I1003 09:25:42.354290 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5253a8-eac9-41f0-8770-642f1bee807c" path="/var/lib/kubelet/pods/5d5253a8-eac9-41f0-8770-642f1bee807c/volumes" Oct 03 09:26:44 crc kubenswrapper[4790]: I1003 09:26:44.821431 4790 generic.go:334] "Generic (PLEG): container finished" podID="780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" containerID="fee00ef21159bb1b571160f9e0c1ee2ce0d144c955b69e12b7cc9351e71653c2" exitCode=0 Oct 03 09:26:44 crc kubenswrapper[4790]: I1003 09:26:44.821623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" event={"ID":"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6","Type":"ContainerDied","Data":"fee00ef21159bb1b571160f9e0c1ee2ce0d144c955b69e12b7cc9351e71653c2"} Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.268783 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.371368 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-0\") pod \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.371446 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-1\") pod \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.371493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-telemetry-combined-ca-bundle\") pod \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.371706 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-665hx\" (UniqueName: \"kubernetes.io/projected/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-kube-api-access-665hx\") pod \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.371767 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-inventory\") pod \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " Oct 03 09:26:46 crc kubenswrapper[4790]: 
I1003 09:26:46.371892 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ssh-key\") pod \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.372779 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-2\") pod \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\" (UID: \"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6\") " Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.379671 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-kube-api-access-665hx" (OuterVolumeSpecName: "kube-api-access-665hx") pod "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" (UID: "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6"). InnerVolumeSpecName "kube-api-access-665hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.391382 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" (UID: "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.400480 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" (UID: "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.408201 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-inventory" (OuterVolumeSpecName: "inventory") pod "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" (UID: "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.410403 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" (UID: "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.414203 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" (UID: "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.419085 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" (UID: "780db1d7-b699-4f38-9f86-c6bcf5c7c9a6"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.483798 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.483875 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.483890 4790 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.483915 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-665hx\" (UniqueName: \"kubernetes.io/projected/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-kube-api-access-665hx\") on node \"crc\" DevicePath \"\"" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.483939 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-inventory\") on node \"crc\" DevicePath \"\"" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.483964 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.483990 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/780db1d7-b699-4f38-9f86-c6bcf5c7c9a6-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.845350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" event={"ID":"780db1d7-b699-4f38-9f86-c6bcf5c7c9a6","Type":"ContainerDied","Data":"d0aa859a38dbcc3167943e2a8ce883fad03e566c32a9245ced37d5a0d54dc36f"} Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.845420 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0aa859a38dbcc3167943e2a8ce883fad03e566c32a9245ced37d5a0d54dc36f" Oct 03 09:26:46 crc kubenswrapper[4790]: I1003 09:26:46.845378 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-48d4w" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.447874 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 03 09:27:21 crc kubenswrapper[4790]: E1003 09:27:21.449111 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerName="registry-server" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.449134 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerName="registry-server" Oct 03 09:27:21 crc kubenswrapper[4790]: E1003 09:27:21.449168 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerName="extract-content" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.449179 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerName="extract-content" Oct 03 09:27:21 crc kubenswrapper[4790]: E1003 09:27:21.449193 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.449205 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 03 09:27:21 crc kubenswrapper[4790]: E1003 09:27:21.449233 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerName="registry-server" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.449244 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerName="registry-server" Oct 03 09:27:21 crc kubenswrapper[4790]: E1003 09:27:21.449263 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerName="extract-utilities" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.449273 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerName="extract-utilities" Oct 03 09:27:21 crc kubenswrapper[4790]: E1003 09:27:21.449297 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerName="extract-content" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.449307 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerName="extract-content" Oct 03 09:27:21 crc kubenswrapper[4790]: E1003 09:27:21.449338 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerName="extract-utilities" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.449348 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerName="extract-utilities" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.449674 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="780db1d7-b699-4f38-9f86-c6bcf5c7c9a6" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.449702 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5253a8-eac9-41f0-8770-642f1bee807c" containerName="registry-server" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.449724 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="db5acd04-b20a-4f50-b8a6-6813961ec49f" containerName="registry-server" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.451058 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.459136 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.484273 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.573776 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.573841 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njfms\" (UniqueName: \"kubernetes.io/projected/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-kube-api-access-njfms\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.573884 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-dev\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.573915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.574230 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-sys\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.574387 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-lib-modules\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.574470 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-config-data\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.574532 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-var-locks-cinder\") pod \"cinder-backup-0\" 
(UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.574628 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.574795 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.574851 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.574912 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-scripts\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.574943 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-run\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc 
kubenswrapper[4790]: I1003 09:27:21.574981 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.575020 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.613486 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.615447 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.618723 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.638363 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.659769 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.661381 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.668872 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.676934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njfms\" (UniqueName: \"kubernetes.io/projected/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-kube-api-access-njfms\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.676982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-dev\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677035 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrhdl\" (UniqueName: 
\"kubernetes.io/projected/63400791-2785-485b-86b4-ffd9943163a5-kube-api-access-nrhdl\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677100 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-run\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677116 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-sys\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677133 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-sys\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677155 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677177 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: 
\"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-lib-modules\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677227 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677244 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-config-data\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677268 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677292 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677321 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677337 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677353 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677373 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677400 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677437 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677458 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-scripts\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677473 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677489 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-run\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677505 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") 
" pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677527 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677548 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-dev\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677604 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677736 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.677992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-dev\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.691911 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-config-data-custom\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.701593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-lib-modules\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.701956 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.705753 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.705843 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: 
I1003 09:27:21.705869 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-sys\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.706681 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-run\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.707754 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-config-data\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.707903 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.708676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-etc-nvme\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.736484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " 
pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.759194 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-scripts\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.761505 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njfms\" (UniqueName: \"kubernetes.io/projected/8927cfc4-2b2e-4a79-bd9e-3ff95160b86a-kube-api-access-njfms\") pod \"cinder-backup-0\" (UID: \"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a\") " pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.779820 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.779879 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrhdl\" (UniqueName: \"kubernetes.io/projected/63400791-2785-485b-86b4-ffd9943163a5-kube-api-access-nrhdl\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.779914 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-run\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.779941 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-sys\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.779980 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780005 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780059 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780093 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780158 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780176 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780222 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780275 
4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.780295 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.792269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.827306 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.828626 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.831720 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.832329 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-run\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.832361 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-sys\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.832748 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.832805 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc 
kubenswrapper[4790]: I1003 09:27:21.832833 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.832865 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.832893 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.833001 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.833266 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.833398 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.833552 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-dev\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.833684 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.833781 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.833866 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q284b\" (UniqueName: \"kubernetes.io/projected/313b0b89-2532-482b-994d-317dfc54b6a7-kube-api-access-q284b\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.833957 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.834070 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.834204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.834288 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.834447 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.834546 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-sys\") pod 
\"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.833698 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-dev\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.834917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/63400791-2785-485b-86b4-ffd9943163a5-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.842148 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.842227 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.843193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.847080 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63400791-2785-485b-86b4-ffd9943163a5-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.854634 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrhdl\" (UniqueName: \"kubernetes.io/projected/63400791-2785-485b-86b4-ffd9943163a5-kube-api-access-nrhdl\") pod \"cinder-volume-nfs-0\" (UID: \"63400791-2785-485b-86b4-ffd9943163a5\") " pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936349 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936374 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936399 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936432 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q284b\" (UniqueName: \"kubernetes.io/projected/313b0b89-2532-482b-994d-317dfc54b6a7-kube-api-access-q284b\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936490 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936522 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936602 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936704 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936803 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936842 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936836 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.936926 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.937514 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.937560 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.938256 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.938781 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.938815 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.938849 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.938871 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.938900 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.938922 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/313b0b89-2532-482b-994d-317dfc54b6a7-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.942017 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.944012 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.948038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.956102 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/313b0b89-2532-482b-994d-317dfc54b6a7-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:21 crc kubenswrapper[4790]: I1003 09:27:21.956495 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q284b\" (UniqueName: \"kubernetes.io/projected/313b0b89-2532-482b-994d-317dfc54b6a7-kube-api-access-q284b\") pod \"cinder-volume-nfs-2-0\" (UID: \"313b0b89-2532-482b-994d-317dfc54b6a7\") " pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:22 crc kubenswrapper[4790]: I1003 09:27:22.226487 4790 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:22 crc kubenswrapper[4790]: I1003 09:27:22.459134 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 03 09:27:22 crc kubenswrapper[4790]: I1003 09:27:22.549456 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Oct 03 09:27:22 crc kubenswrapper[4790]: W1003 09:27:22.591286 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63400791_2785_485b_86b4_ffd9943163a5.slice/crio-f82852ed5c27e5516f440bc6ea954867259051550d2fc2b992a24c6b95898393 WatchSource:0}: Error finding container f82852ed5c27e5516f440bc6ea954867259051550d2fc2b992a24c6b95898393: Status 404 returned error can't find the container with id f82852ed5c27e5516f440bc6ea954867259051550d2fc2b992a24c6b95898393 Oct 03 09:27:22 crc kubenswrapper[4790]: I1003 09:27:22.801272 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Oct 03 09:27:23 crc kubenswrapper[4790]: I1003 09:27:23.299785 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"63400791-2785-485b-86b4-ffd9943163a5","Type":"ContainerStarted","Data":"1f5a2639d537d21adc635d9e277bc60af214beadbf8830e2a4ec15cc7aecc41c"} Oct 03 09:27:23 crc kubenswrapper[4790]: I1003 09:27:23.301425 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"63400791-2785-485b-86b4-ffd9943163a5","Type":"ContainerStarted","Data":"f82852ed5c27e5516f440bc6ea954867259051550d2fc2b992a24c6b95898393"} Oct 03 09:27:23 crc kubenswrapper[4790]: I1003 09:27:23.302027 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" 
event={"ID":"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a","Type":"ContainerStarted","Data":"92fc2882e1235a12156281617969c94138290217e3cd3b8619ade46c272b9c64"} Oct 03 09:27:23 crc kubenswrapper[4790]: I1003 09:27:23.302077 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a","Type":"ContainerStarted","Data":"24b3c23124e36df36956bfcaad843ed77acebc13daeb0a34dd5793da6e48ab52"} Oct 03 09:27:23 crc kubenswrapper[4790]: I1003 09:27:23.308059 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"313b0b89-2532-482b-994d-317dfc54b6a7","Type":"ContainerStarted","Data":"e2ef2c8c47d633914ab700273d4630c8f8d3debd0bf340735304fb88320d596b"} Oct 03 09:27:23 crc kubenswrapper[4790]: I1003 09:27:23.308104 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"313b0b89-2532-482b-994d-317dfc54b6a7","Type":"ContainerStarted","Data":"65f4262e229f53231e9c3afdf9fcfab92396c615e785d1f4df1eaa31652ebfcb"} Oct 03 09:27:24 crc kubenswrapper[4790]: I1003 09:27:24.319266 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"8927cfc4-2b2e-4a79-bd9e-3ff95160b86a","Type":"ContainerStarted","Data":"0e2a65b01e5941ba6970734f98cbaed18744481ebdaa8a0233be69f731bfe078"} Oct 03 09:27:24 crc kubenswrapper[4790]: I1003 09:27:24.322634 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"313b0b89-2532-482b-994d-317dfc54b6a7","Type":"ContainerStarted","Data":"ade2948af64b91854822365b8a6c9e1dcdb4e3032a82e4775f5aaf9a1950ed2a"} Oct 03 09:27:24 crc kubenswrapper[4790]: I1003 09:27:24.323918 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"63400791-2785-485b-86b4-ffd9943163a5","Type":"ContainerStarted","Data":"89ce4b1b787eff5cfa6a7ddfad2f839290d5b49a542b9e9b6f88910e259d01cb"} Oct 03 
09:27:24 crc kubenswrapper[4790]: I1003 09:27:24.351001 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.015263249 podStartE2EDuration="3.350978892s" podCreationTimestamp="2025-10-03 09:27:21 +0000 UTC" firstStartedPulling="2025-10-03 09:27:22.454919599 +0000 UTC m=+2828.807853837" lastFinishedPulling="2025-10-03 09:27:22.790635252 +0000 UTC m=+2829.143569480" observedRunningTime="2025-10-03 09:27:24.341406069 +0000 UTC m=+2830.694340327" watchObservedRunningTime="2025-10-03 09:27:24.350978892 +0000 UTC m=+2830.703913130" Oct 03 09:27:24 crc kubenswrapper[4790]: I1003 09:27:24.382031 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=3.294358364 podStartE2EDuration="3.382010865s" podCreationTimestamp="2025-10-03 09:27:21 +0000 UTC" firstStartedPulling="2025-10-03 09:27:22.814131958 +0000 UTC m=+2829.167066196" lastFinishedPulling="2025-10-03 09:27:22.901784459 +0000 UTC m=+2829.254718697" observedRunningTime="2025-10-03 09:27:24.37524806 +0000 UTC m=+2830.728182298" watchObservedRunningTime="2025-10-03 09:27:24.382010865 +0000 UTC m=+2830.734945103" Oct 03 09:27:24 crc kubenswrapper[4790]: I1003 09:27:24.423735 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=3.098866429 podStartE2EDuration="3.423720683s" podCreationTimestamp="2025-10-03 09:27:21 +0000 UTC" firstStartedPulling="2025-10-03 09:27:22.593632534 +0000 UTC m=+2828.946566772" lastFinishedPulling="2025-10-03 09:27:22.918486788 +0000 UTC m=+2829.271421026" observedRunningTime="2025-10-03 09:27:24.417381328 +0000 UTC m=+2830.770315566" watchObservedRunningTime="2025-10-03 09:27:24.423720683 +0000 UTC m=+2830.776654921" Oct 03 09:27:26 crc kubenswrapper[4790]: I1003 09:27:26.829130 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-backup-0" Oct 03 09:27:26 crc kubenswrapper[4790]: I1003 09:27:26.937319 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:27 crc kubenswrapper[4790]: I1003 09:27:27.228358 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:31 crc kubenswrapper[4790]: I1003 09:27:31.962414 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 03 09:27:32 crc kubenswrapper[4790]: I1003 09:27:32.360938 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Oct 03 09:27:32 crc kubenswrapper[4790]: I1003 09:27:32.488280 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Oct 03 09:27:40 crc kubenswrapper[4790]: I1003 09:27:40.186491 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:27:40 crc kubenswrapper[4790]: I1003 09:27:40.187080 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:28:10 crc kubenswrapper[4790]: I1003 09:28:10.186765 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 03 09:28:10 crc kubenswrapper[4790]: I1003 09:28:10.187516 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:28:22 crc kubenswrapper[4790]: I1003 09:28:22.627949 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:28:22 crc kubenswrapper[4790]: I1003 09:28:22.628960 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="prometheus" containerID="cri-o://b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41" gracePeriod=600 Oct 03 09:28:22 crc kubenswrapper[4790]: I1003 09:28:22.629046 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="thanos-sidecar" containerID="cri-o://b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538" gracePeriod=600 Oct 03 09:28:22 crc kubenswrapper[4790]: I1003 09:28:22.629113 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="config-reloader" containerID="cri-o://6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad" gracePeriod=600 Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.024683 4790 generic.go:334] "Generic (PLEG): container finished" podID="969ea996-248e-41c5-a868-bd2321bfdf49" containerID="b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538" exitCode=0 Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.024714 4790 generic.go:334] 
"Generic (PLEG): container finished" podID="969ea996-248e-41c5-a868-bd2321bfdf49" containerID="b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41" exitCode=0 Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.024737 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"969ea996-248e-41c5-a868-bd2321bfdf49","Type":"ContainerDied","Data":"b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538"} Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.024763 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"969ea996-248e-41c5-a868-bd2321bfdf49","Type":"ContainerDied","Data":"b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41"} Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.449592 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.485648 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.500338 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.507736 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.578406 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config" (OuterVolumeSpecName: "web-config") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.602391 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-config\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.602585 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g55gr\" (UniqueName: \"kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-kube-api-access-g55gr\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.602616 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-thanos-prometheus-http-client-file\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc 
kubenswrapper[4790]: I1003 09:28:23.602654 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/969ea996-248e-41c5-a868-bd2321bfdf49-config-out\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.602765 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-secret-combined-ca-bundle\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.603351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.603383 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-tls-assets\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.603432 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.603455 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/969ea996-248e-41c5-a868-bd2321bfdf49-prometheus-metric-storage-rulefiles-0\") pod \"969ea996-248e-41c5-a868-bd2321bfdf49\" (UID: \"969ea996-248e-41c5-a868-bd2321bfdf49\") " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.604029 4790 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.604063 4790 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.607648 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/969ea996-248e-41c5-a868-bd2321bfdf49-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.607662 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/969ea996-248e-41c5-a868-bd2321bfdf49-config-out" (OuterVolumeSpecName: "config-out") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.610159 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-config" (OuterVolumeSpecName: "config") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.612691 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.613194 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.614347 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.618750 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.620909 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-kube-api-access-g55gr" (OuterVolumeSpecName: "kube-api-access-g55gr") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "kube-api-access-g55gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.636151 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "969ea996-248e-41c5-a868-bd2321bfdf49" (UID: "969ea996-248e-41c5-a868-bd2321bfdf49"). InnerVolumeSpecName "pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.705853 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g55gr\" (UniqueName: \"kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-kube-api-access-g55gr\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.705890 4790 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.705908 4790 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/969ea996-248e-41c5-a868-bd2321bfdf49-config-out\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.705920 4790 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.705965 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") on node \"crc\" " Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.706004 4790 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/969ea996-248e-41c5-a868-bd2321bfdf49-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.706017 4790 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.706029 4790 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/969ea996-248e-41c5-a868-bd2321bfdf49-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.706041 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/969ea996-248e-41c5-a868-bd2321bfdf49-config\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.735070 4790 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.735873 4790 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4") on node "crc" Oct 03 09:28:23 crc kubenswrapper[4790]: I1003 09:28:23.807953 4790 reconciler_common.go:293] "Volume detached for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") on node \"crc\" DevicePath \"\"" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.034728 4790 generic.go:334] "Generic (PLEG): container finished" podID="969ea996-248e-41c5-a868-bd2321bfdf49" containerID="6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad" exitCode=0 Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.034778 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"969ea996-248e-41c5-a868-bd2321bfdf49","Type":"ContainerDied","Data":"6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad"} Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.034785 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.034814 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"969ea996-248e-41c5-a868-bd2321bfdf49","Type":"ContainerDied","Data":"6aba2847061e4fa530565b96e4fe6bc6987c8e70fff3a26d62d3735a6b628e86"} Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.034834 4790 scope.go:117] "RemoveContainer" containerID="b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.067784 4790 scope.go:117] "RemoveContainer" containerID="6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.078764 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.091509 4790 scope.go:117] "RemoveContainer" containerID="b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.096645 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.105693 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:28:24 crc kubenswrapper[4790]: E1003 09:28:24.106173 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="thanos-sidecar" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.106186 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="thanos-sidecar" Oct 03 09:28:24 crc kubenswrapper[4790]: E1003 09:28:24.106216 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="prometheus" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.106221 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="prometheus" Oct 03 09:28:24 crc kubenswrapper[4790]: E1003 09:28:24.106239 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="config-reloader" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.106246 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="config-reloader" Oct 03 09:28:24 crc kubenswrapper[4790]: E1003 09:28:24.106268 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="init-config-reloader" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.106274 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="init-config-reloader" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.106471 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="prometheus" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.106483 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="config-reloader" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.106500 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="thanos-sidecar" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.108646 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.111151 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.111384 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.111515 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.112415 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.113706 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-xvhkt" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.116602 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.116828 4790 scope.go:117] "RemoveContainer" containerID="1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.123705 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.141643 4790 scope.go:117] "RemoveContainer" containerID="b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538" Oct 03 09:28:24 crc kubenswrapper[4790]: E1003 09:28:24.143533 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538\": container with ID starting with 
b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538 not found: ID does not exist" containerID="b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.144212 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538"} err="failed to get container status \"b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538\": rpc error: code = NotFound desc = could not find container \"b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538\": container with ID starting with b820bc6e813910c00009b922deffd923d0f8bac02f59c283da5b716de5621538 not found: ID does not exist" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.144251 4790 scope.go:117] "RemoveContainer" containerID="6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad" Oct 03 09:28:24 crc kubenswrapper[4790]: E1003 09:28:24.145823 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad\": container with ID starting with 6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad not found: ID does not exist" containerID="6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.145882 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad"} err="failed to get container status \"6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad\": rpc error: code = NotFound desc = could not find container \"6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad\": container with ID starting with 6a4046c5494a8658b86a447e2ee54bf5c6268e28cb833c7de38e281d7992caad not found: ID does not 
exist" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.145913 4790 scope.go:117] "RemoveContainer" containerID="b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41" Oct 03 09:28:24 crc kubenswrapper[4790]: E1003 09:28:24.147239 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41\": container with ID starting with b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41 not found: ID does not exist" containerID="b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.147294 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41"} err="failed to get container status \"b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41\": rpc error: code = NotFound desc = could not find container \"b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41\": container with ID starting with b2fa0bbfa445c98b31a8b51965a7772f66cb02af44db0718136e149250b82e41 not found: ID does not exist" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.147307 4790 scope.go:117] "RemoveContainer" containerID="1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623" Oct 03 09:28:24 crc kubenswrapper[4790]: E1003 09:28:24.150376 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623\": container with ID starting with 1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623 not found: ID does not exist" containerID="1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.150416 4790 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623"} err="failed to get container status \"1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623\": rpc error: code = NotFound desc = could not find container \"1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623\": container with ID starting with 1be963d592188b95b344a514f65a2bc6200f41163ad164180e6b2aa47557d623 not found: ID does not exist" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.216568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ba88a153-0441-4fd3-a3b3-37338c589169-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.216624 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.216693 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.216721 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brgld\" (UniqueName: \"kubernetes.io/projected/ba88a153-0441-4fd3-a3b3-37338c589169-kube-api-access-brgld\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.216750 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-config\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.216777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.216824 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba88a153-0441-4fd3-a3b3-37338c589169-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.216872 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 
crc kubenswrapper[4790]: I1003 09:28:24.216910 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.216953 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba88a153-0441-4fd3-a3b3-37338c589169-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.216983 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319383 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ba88a153-0441-4fd3-a3b3-37338c589169-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319403 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319453 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319473 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brgld\" (UniqueName: \"kubernetes.io/projected/ba88a153-0441-4fd3-a3b3-37338c589169-kube-api-access-brgld\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319497 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-config\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319534 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319569 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba88a153-0441-4fd3-a3b3-37338c589169-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319614 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.319667 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba88a153-0441-4fd3-a3b3-37338c589169-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.320832 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ba88a153-0441-4fd3-a3b3-37338c589169-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.326488 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.326522 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a9f968d81b8706f372416bcf6becb6316da12568f5c5e3584a75f6147541fb3e/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.327724 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.329405 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.330189 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.333691 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba88a153-0441-4fd3-a3b3-37338c589169-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.335847 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba88a153-0441-4fd3-a3b3-37338c589169-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.336305 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.337992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc 
kubenswrapper[4790]: I1003 09:28:24.339455 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brgld\" (UniqueName: \"kubernetes.io/projected/ba88a153-0441-4fd3-a3b3-37338c589169-kube-api-access-brgld\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.351146 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" path="/var/lib/kubelet/pods/969ea996-248e-41c5-a868-bd2321bfdf49/volumes" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.382173 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba88a153-0441-4fd3-a3b3-37338c589169-config\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.433278 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84e9d10c-4308-45f2-a94a-547cf8f7d3b4\") pod \"prometheus-metric-storage-0\" (UID: \"ba88a153-0441-4fd3-a3b3-37338c589169\") " pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.472075 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:24 crc kubenswrapper[4790]: I1003 09:28:24.942694 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 03 09:28:25 crc kubenswrapper[4790]: I1003 09:28:25.062864 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ba88a153-0441-4fd3-a3b3-37338c589169","Type":"ContainerStarted","Data":"8b86a4c0fb5659dea207725dbd5c710c84f0c00fba5712fc3de247e6a654ac3a"} Oct 03 09:28:26 crc kubenswrapper[4790]: I1003 09:28:26.317443 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="969ea996-248e-41c5-a868-bd2321bfdf49" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.131:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 03 09:28:29 crc kubenswrapper[4790]: I1003 09:28:29.105835 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ba88a153-0441-4fd3-a3b3-37338c589169","Type":"ContainerStarted","Data":"590da544425ecfd8dfe9f89998f125d20bca0dfe586c0feee0e65618e41ed082"} Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.676057 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fxdpq"] Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.679860 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.694341 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxdpq"] Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.792327 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-utilities\") pod \"redhat-operators-fxdpq\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.792891 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-catalog-content\") pod \"redhat-operators-fxdpq\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.792945 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh5fx\" (UniqueName: \"kubernetes.io/projected/06710fa8-bc46-4708-b08d-ebfa316e1086-kube-api-access-wh5fx\") pod \"redhat-operators-fxdpq\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.895466 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-utilities\") pod \"redhat-operators-fxdpq\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.895610 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-catalog-content\") pod \"redhat-operators-fxdpq\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.895648 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh5fx\" (UniqueName: \"kubernetes.io/projected/06710fa8-bc46-4708-b08d-ebfa316e1086-kube-api-access-wh5fx\") pod \"redhat-operators-fxdpq\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.896180 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-utilities\") pod \"redhat-operators-fxdpq\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.896252 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-catalog-content\") pod \"redhat-operators-fxdpq\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:35 crc kubenswrapper[4790]: I1003 09:28:35.922341 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh5fx\" (UniqueName: \"kubernetes.io/projected/06710fa8-bc46-4708-b08d-ebfa316e1086-kube-api-access-wh5fx\") pod \"redhat-operators-fxdpq\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:36 crc kubenswrapper[4790]: I1003 09:28:36.001939 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:36 crc kubenswrapper[4790]: I1003 09:28:36.182901 4790 generic.go:334] "Generic (PLEG): container finished" podID="ba88a153-0441-4fd3-a3b3-37338c589169" containerID="590da544425ecfd8dfe9f89998f125d20bca0dfe586c0feee0e65618e41ed082" exitCode=0 Oct 03 09:28:36 crc kubenswrapper[4790]: I1003 09:28:36.183191 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ba88a153-0441-4fd3-a3b3-37338c589169","Type":"ContainerDied","Data":"590da544425ecfd8dfe9f89998f125d20bca0dfe586c0feee0e65618e41ed082"} Oct 03 09:28:36 crc kubenswrapper[4790]: W1003 09:28:36.483091 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06710fa8_bc46_4708_b08d_ebfa316e1086.slice/crio-226347b1ce4078075e403b3564c30b2d366befcab24e2e19eca086c77fba0eac WatchSource:0}: Error finding container 226347b1ce4078075e403b3564c30b2d366befcab24e2e19eca086c77fba0eac: Status 404 returned error can't find the container with id 226347b1ce4078075e403b3564c30b2d366befcab24e2e19eca086c77fba0eac Oct 03 09:28:36 crc kubenswrapper[4790]: I1003 09:28:36.484072 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxdpq"] Oct 03 09:28:37 crc kubenswrapper[4790]: I1003 09:28:37.205442 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ba88a153-0441-4fd3-a3b3-37338c589169","Type":"ContainerStarted","Data":"6746b5ed24858a076887ba8dedaa26498d146a59e3fc8de314658b9d0608b7c1"} Oct 03 09:28:37 crc kubenswrapper[4790]: I1003 09:28:37.216511 4790 generic.go:334] "Generic (PLEG): container finished" podID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerID="f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566" exitCode=0 Oct 03 09:28:37 crc kubenswrapper[4790]: I1003 09:28:37.216583 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxdpq" event={"ID":"06710fa8-bc46-4708-b08d-ebfa316e1086","Type":"ContainerDied","Data":"f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566"} Oct 03 09:28:37 crc kubenswrapper[4790]: I1003 09:28:37.216619 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxdpq" event={"ID":"06710fa8-bc46-4708-b08d-ebfa316e1086","Type":"ContainerStarted","Data":"226347b1ce4078075e403b3564c30b2d366befcab24e2e19eca086c77fba0eac"} Oct 03 09:28:38 crc kubenswrapper[4790]: I1003 09:28:38.232660 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxdpq" event={"ID":"06710fa8-bc46-4708-b08d-ebfa316e1086","Type":"ContainerStarted","Data":"6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080"} Oct 03 09:28:40 crc kubenswrapper[4790]: I1003 09:28:40.186872 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:28:40 crc kubenswrapper[4790]: I1003 09:28:40.187360 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:28:40 crc kubenswrapper[4790]: I1003 09:28:40.187400 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 09:28:40 crc kubenswrapper[4790]: I1003 09:28:40.188102 4790 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:28:40 crc kubenswrapper[4790]: I1003 09:28:40.188148 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" gracePeriod=600 Oct 03 09:28:40 crc kubenswrapper[4790]: I1003 09:28:40.252642 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ba88a153-0441-4fd3-a3b3-37338c589169","Type":"ContainerStarted","Data":"86ee688425319765759cb72a5d208e1a20d0b95a39421e8be7ccd40c157dab5c"} Oct 03 09:28:40 crc kubenswrapper[4790]: I1003 09:28:40.253036 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ba88a153-0441-4fd3-a3b3-37338c589169","Type":"ContainerStarted","Data":"7672b432a31e2fd37003c1db8a4185653a26fe96b6558416ff64c78021938c53"} Oct 03 09:28:40 crc kubenswrapper[4790]: I1003 09:28:40.281455 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.281438819 podStartE2EDuration="16.281438819s" podCreationTimestamp="2025-10-03 09:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:28:40.274819206 +0000 UTC m=+2906.627753464" watchObservedRunningTime="2025-10-03 09:28:40.281438819 +0000 UTC m=+2906.634373057" Oct 03 09:28:40 crc kubenswrapper[4790]: E1003 09:28:40.331517 4790 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:28:41 crc kubenswrapper[4790]: I1003 09:28:41.263531 4790 generic.go:334] "Generic (PLEG): container finished" podID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerID="6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080" exitCode=0 Oct 03 09:28:41 crc kubenswrapper[4790]: I1003 09:28:41.263649 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxdpq" event={"ID":"06710fa8-bc46-4708-b08d-ebfa316e1086","Type":"ContainerDied","Data":"6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080"} Oct 03 09:28:41 crc kubenswrapper[4790]: I1003 09:28:41.266917 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" exitCode=0 Oct 03 09:28:41 crc kubenswrapper[4790]: I1003 09:28:41.267001 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b"} Oct 03 09:28:41 crc kubenswrapper[4790]: I1003 09:28:41.267090 4790 scope.go:117] "RemoveContainer" containerID="19d3a3ec163b3ce35fa277de2e5c9c85ded382af52d82e819b0c00a5f4d71d5c" Oct 03 09:28:41 crc kubenswrapper[4790]: I1003 09:28:41.268278 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:28:41 crc kubenswrapper[4790]: E1003 09:28:41.268781 4790 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:28:42 crc kubenswrapper[4790]: I1003 09:28:42.283392 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxdpq" event={"ID":"06710fa8-bc46-4708-b08d-ebfa316e1086","Type":"ContainerStarted","Data":"9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077"} Oct 03 09:28:42 crc kubenswrapper[4790]: I1003 09:28:42.318335 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fxdpq" podStartSLOduration=2.830821815 podStartE2EDuration="7.318318464s" podCreationTimestamp="2025-10-03 09:28:35 +0000 UTC" firstStartedPulling="2025-10-03 09:28:37.221347124 +0000 UTC m=+2903.574281362" lastFinishedPulling="2025-10-03 09:28:41.708843763 +0000 UTC m=+2908.061778011" observedRunningTime="2025-10-03 09:28:42.312180735 +0000 UTC m=+2908.665115003" watchObservedRunningTime="2025-10-03 09:28:42.318318464 +0000 UTC m=+2908.671252702" Oct 03 09:28:44 crc kubenswrapper[4790]: I1003 09:28:44.473182 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:46 crc kubenswrapper[4790]: I1003 09:28:46.002058 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:46 crc kubenswrapper[4790]: I1003 09:28:46.002369 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:28:47 crc kubenswrapper[4790]: I1003 09:28:47.071226 4790 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fxdpq" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerName="registry-server" probeResult="failure" output=< Oct 03 09:28:47 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 09:28:47 crc kubenswrapper[4790]: > Oct 03 09:28:54 crc kubenswrapper[4790]: I1003 09:28:54.339460 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:28:54 crc kubenswrapper[4790]: E1003 09:28:54.341646 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:28:54 crc kubenswrapper[4790]: I1003 09:28:54.472918 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:54 crc kubenswrapper[4790]: I1003 09:28:54.481313 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:55 crc kubenswrapper[4790]: I1003 09:28:55.445916 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 03 09:28:57 crc kubenswrapper[4790]: I1003 09:28:57.053314 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fxdpq" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerName="registry-server" probeResult="failure" output=< Oct 03 09:28:57 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 09:28:57 crc kubenswrapper[4790]: > Oct 03 09:29:06 
crc kubenswrapper[4790]: I1003 09:29:06.067753 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:29:06 crc kubenswrapper[4790]: I1003 09:29:06.137932 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:29:06 crc kubenswrapper[4790]: I1003 09:29:06.879700 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxdpq"] Oct 03 09:29:07 crc kubenswrapper[4790]: I1003 09:29:07.570334 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fxdpq" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerName="registry-server" containerID="cri-o://9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077" gracePeriod=2 Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.155956 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.336938 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:29:08 crc kubenswrapper[4790]: E1003 09:29:08.337145 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.337814 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-utilities\") pod \"06710fa8-bc46-4708-b08d-ebfa316e1086\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.338656 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh5fx\" (UniqueName: \"kubernetes.io/projected/06710fa8-bc46-4708-b08d-ebfa316e1086-kube-api-access-wh5fx\") pod \"06710fa8-bc46-4708-b08d-ebfa316e1086\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.338729 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-catalog-content\") pod \"06710fa8-bc46-4708-b08d-ebfa316e1086\" (UID: \"06710fa8-bc46-4708-b08d-ebfa316e1086\") " Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.339279 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-utilities" (OuterVolumeSpecName: "utilities") pod "06710fa8-bc46-4708-b08d-ebfa316e1086" (UID: "06710fa8-bc46-4708-b08d-ebfa316e1086"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.339493 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.347934 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06710fa8-bc46-4708-b08d-ebfa316e1086-kube-api-access-wh5fx" (OuterVolumeSpecName: "kube-api-access-wh5fx") pod "06710fa8-bc46-4708-b08d-ebfa316e1086" (UID: "06710fa8-bc46-4708-b08d-ebfa316e1086"). InnerVolumeSpecName "kube-api-access-wh5fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.437959 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06710fa8-bc46-4708-b08d-ebfa316e1086" (UID: "06710fa8-bc46-4708-b08d-ebfa316e1086"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.441246 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh5fx\" (UniqueName: \"kubernetes.io/projected/06710fa8-bc46-4708-b08d-ebfa316e1086-kube-api-access-wh5fx\") on node \"crc\" DevicePath \"\"" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.441281 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06710fa8-bc46-4708-b08d-ebfa316e1086-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.584827 4790 generic.go:334] "Generic (PLEG): container finished" podID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerID="9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077" exitCode=0 Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.585605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxdpq" event={"ID":"06710fa8-bc46-4708-b08d-ebfa316e1086","Type":"ContainerDied","Data":"9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077"} Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.586035 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxdpq" event={"ID":"06710fa8-bc46-4708-b08d-ebfa316e1086","Type":"ContainerDied","Data":"226347b1ce4078075e403b3564c30b2d366befcab24e2e19eca086c77fba0eac"} Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.586118 4790 scope.go:117] "RemoveContainer" containerID="9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.585732 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxdpq" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.635999 4790 scope.go:117] "RemoveContainer" containerID="6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.644408 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxdpq"] Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.657882 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fxdpq"] Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.659527 4790 scope.go:117] "RemoveContainer" containerID="f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.732286 4790 scope.go:117] "RemoveContainer" containerID="9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077" Oct 03 09:29:08 crc kubenswrapper[4790]: E1003 09:29:08.733659 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077\": container with ID starting with 9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077 not found: ID does not exist" containerID="9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.733769 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077"} err="failed to get container status \"9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077\": rpc error: code = NotFound desc = could not find container \"9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077\": container with ID starting with 9d029c918ef6bee44c7f79de327d86ef916c3d192f5a2f2bb3af244907723077 not found: ID does 
not exist" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.733857 4790 scope.go:117] "RemoveContainer" containerID="6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080" Oct 03 09:29:08 crc kubenswrapper[4790]: E1003 09:29:08.734330 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080\": container with ID starting with 6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080 not found: ID does not exist" containerID="6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.734438 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080"} err="failed to get container status \"6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080\": rpc error: code = NotFound desc = could not find container \"6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080\": container with ID starting with 6e8bd8ec9d195f5bb2cceca34fa23767421c86f6d2e6ca54d24425f7c18e9080 not found: ID does not exist" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.734513 4790 scope.go:117] "RemoveContainer" containerID="f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566" Oct 03 09:29:08 crc kubenswrapper[4790]: E1003 09:29:08.734886 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566\": container with ID starting with f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566 not found: ID does not exist" containerID="f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566" Oct 03 09:29:08 crc kubenswrapper[4790]: I1003 09:29:08.734908 4790 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566"} err="failed to get container status \"f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566\": rpc error: code = NotFound desc = could not find container \"f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566\": container with ID starting with f15013e98ba04c79aeb4b1c0727fc0d12422f2e3650d7f5dc6e47aeb593be566 not found: ID does not exist" Oct 03 09:29:10 crc kubenswrapper[4790]: I1003 09:29:10.348995 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" path="/var/lib/kubelet/pods/06710fa8-bc46-4708-b08d-ebfa316e1086/volumes" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.583765 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 03 09:29:19 crc kubenswrapper[4790]: E1003 09:29:19.589388 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerName="extract-utilities" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.589447 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerName="extract-utilities" Oct 03 09:29:19 crc kubenswrapper[4790]: E1003 09:29:19.589556 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerName="extract-content" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.589602 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerName="extract-content" Oct 03 09:29:19 crc kubenswrapper[4790]: E1003 09:29:19.589730 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerName="registry-server" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.589749 4790 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerName="registry-server" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.590974 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="06710fa8-bc46-4708-b08d-ebfa316e1086" containerName="registry-server" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.621726 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.621885 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.624554 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.629869 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.630190 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xmfm8" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.631800 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.713133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-config-data\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.713237 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.713260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.815851 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-config-data\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.815954 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.815994 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.816266 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-openstack-config-secret\") 
pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.816396 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.816497 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.816853 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.816969 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.817045 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm8pg\" (UniqueName: 
\"kubernetes.io/projected/69ff309d-d784-487d-a190-4529008cb497-kube-api-access-sm8pg\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.817672 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-config-data\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.818929 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.823075 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.918902 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.918965 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " 
pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.919052 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.919130 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.919171 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.919205 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm8pg\" (UniqueName: \"kubernetes.io/projected/69ff309d-d784-487d-a190-4529008cb497-kube-api-access-sm8pg\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.919966 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " 
pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.920057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.920921 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.924645 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.930719 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.938082 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm8pg\" (UniqueName: \"kubernetes.io/projected/69ff309d-d784-487d-a190-4529008cb497-kube-api-access-sm8pg\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:19 crc kubenswrapper[4790]: I1003 09:29:19.978954 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " pod="openstack/tempest-tests-tempest" Oct 03 09:29:20 crc kubenswrapper[4790]: I1003 09:29:20.013218 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 09:29:20 crc kubenswrapper[4790]: I1003 09:29:20.509601 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 03 09:29:20 crc kubenswrapper[4790]: W1003 09:29:20.513982 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69ff309d_d784_487d_a190_4529008cb497.slice/crio-9565be091cb3cbf943f5ebb99d76d02e775f0f8a5d2bad58d9a6ab36356159e1 WatchSource:0}: Error finding container 9565be091cb3cbf943f5ebb99d76d02e775f0f8a5d2bad58d9a6ab36356159e1: Status 404 returned error can't find the container with id 9565be091cb3cbf943f5ebb99d76d02e775f0f8a5d2bad58d9a6ab36356159e1 Oct 03 09:29:20 crc kubenswrapper[4790]: I1003 09:29:20.520380 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:29:20 crc kubenswrapper[4790]: I1003 09:29:20.708859 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"69ff309d-d784-487d-a190-4529008cb497","Type":"ContainerStarted","Data":"9565be091cb3cbf943f5ebb99d76d02e775f0f8a5d2bad58d9a6ab36356159e1"} Oct 03 09:29:21 crc kubenswrapper[4790]: I1003 09:29:21.329346 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:29:21 crc kubenswrapper[4790]: E1003 09:29:21.329909 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:29:30 crc kubenswrapper[4790]: I1003 09:29:30.827556 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"69ff309d-d784-487d-a190-4529008cb497","Type":"ContainerStarted","Data":"2e5feb2eaec978d38327aa47155c6f3b66adf79d0d1f1bc741f20249ab2c1c2b"} Oct 03 09:29:30 crc kubenswrapper[4790]: I1003 09:29:30.861715 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.095707196 podStartE2EDuration="12.861693905s" podCreationTimestamp="2025-10-03 09:29:18 +0000 UTC" firstStartedPulling="2025-10-03 09:29:20.520177228 +0000 UTC m=+2946.873111466" lastFinishedPulling="2025-10-03 09:29:29.286163937 +0000 UTC m=+2955.639098175" observedRunningTime="2025-10-03 09:29:30.860204095 +0000 UTC m=+2957.213138343" watchObservedRunningTime="2025-10-03 09:29:30.861693905 +0000 UTC m=+2957.214628163" Oct 03 09:29:32 crc kubenswrapper[4790]: I1003 09:29:32.333889 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:29:32 crc kubenswrapper[4790]: E1003 09:29:32.334425 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:29:45 crc kubenswrapper[4790]: I1003 09:29:45.329259 4790 scope.go:117] 
"RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:29:45 crc kubenswrapper[4790]: E1003 09:29:45.330547 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:29:59 crc kubenswrapper[4790]: I1003 09:29:59.328397 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:29:59 crc kubenswrapper[4790]: E1003 09:29:59.329820 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.216467 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh"] Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.218429 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.220534 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.220814 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.240327 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh"] Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.379066 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72133265-7f2f-4394-b2a6-79ea86c4d8d2-config-volume\") pod \"collect-profiles-29324730-q4thh\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.380424 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72133265-7f2f-4394-b2a6-79ea86c4d8d2-secret-volume\") pod \"collect-profiles-29324730-q4thh\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.380524 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6l75\" (UniqueName: \"kubernetes.io/projected/72133265-7f2f-4394-b2a6-79ea86c4d8d2-kube-api-access-j6l75\") pod \"collect-profiles-29324730-q4thh\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.482168 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72133265-7f2f-4394-b2a6-79ea86c4d8d2-config-volume\") pod \"collect-profiles-29324730-q4thh\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.482563 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72133265-7f2f-4394-b2a6-79ea86c4d8d2-secret-volume\") pod \"collect-profiles-29324730-q4thh\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.483188 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72133265-7f2f-4394-b2a6-79ea86c4d8d2-config-volume\") pod \"collect-profiles-29324730-q4thh\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.483726 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6l75\" (UniqueName: \"kubernetes.io/projected/72133265-7f2f-4394-b2a6-79ea86c4d8d2-kube-api-access-j6l75\") pod \"collect-profiles-29324730-q4thh\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.497277 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/72133265-7f2f-4394-b2a6-79ea86c4d8d2-secret-volume\") pod \"collect-profiles-29324730-q4thh\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.500498 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6l75\" (UniqueName: \"kubernetes.io/projected/72133265-7f2f-4394-b2a6-79ea86c4d8d2-kube-api-access-j6l75\") pod \"collect-profiles-29324730-q4thh\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:00 crc kubenswrapper[4790]: I1003 09:30:00.549421 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:01 crc kubenswrapper[4790]: I1003 09:30:01.026010 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh"] Oct 03 09:30:01 crc kubenswrapper[4790]: W1003 09:30:01.026875 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72133265_7f2f_4394_b2a6_79ea86c4d8d2.slice/crio-7f1f567d0f384b0dc4a79e7a7edee41f4755080fa010dffb2e13ecb330f71685 WatchSource:0}: Error finding container 7f1f567d0f384b0dc4a79e7a7edee41f4755080fa010dffb2e13ecb330f71685: Status 404 returned error can't find the container with id 7f1f567d0f384b0dc4a79e7a7edee41f4755080fa010dffb2e13ecb330f71685 Oct 03 09:30:01 crc kubenswrapper[4790]: I1003 09:30:01.216942 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" event={"ID":"72133265-7f2f-4394-b2a6-79ea86c4d8d2","Type":"ContainerStarted","Data":"bb1ed60ac67850026255fc988a3f9d1023630b7dfd4e875c77b9a3235bb0fbd5"} Oct 03 09:30:01 crc 
kubenswrapper[4790]: I1003 09:30:01.217284 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" event={"ID":"72133265-7f2f-4394-b2a6-79ea86c4d8d2","Type":"ContainerStarted","Data":"7f1f567d0f384b0dc4a79e7a7edee41f4755080fa010dffb2e13ecb330f71685"} Oct 03 09:30:01 crc kubenswrapper[4790]: I1003 09:30:01.239663 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" podStartSLOduration=1.239639648 podStartE2EDuration="1.239639648s" podCreationTimestamp="2025-10-03 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 09:30:01.230701922 +0000 UTC m=+2987.583636180" watchObservedRunningTime="2025-10-03 09:30:01.239639648 +0000 UTC m=+2987.592573896" Oct 03 09:30:02 crc kubenswrapper[4790]: I1003 09:30:02.228307 4790 generic.go:334] "Generic (PLEG): container finished" podID="72133265-7f2f-4394-b2a6-79ea86c4d8d2" containerID="bb1ed60ac67850026255fc988a3f9d1023630b7dfd4e875c77b9a3235bb0fbd5" exitCode=0 Oct 03 09:30:02 crc kubenswrapper[4790]: I1003 09:30:02.228522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" event={"ID":"72133265-7f2f-4394-b2a6-79ea86c4d8d2","Type":"ContainerDied","Data":"bb1ed60ac67850026255fc988a3f9d1023630b7dfd4e875c77b9a3235bb0fbd5"} Oct 03 09:30:03 crc kubenswrapper[4790]: I1003 09:30:03.629432 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:03 crc kubenswrapper[4790]: I1003 09:30:03.752929 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6l75\" (UniqueName: \"kubernetes.io/projected/72133265-7f2f-4394-b2a6-79ea86c4d8d2-kube-api-access-j6l75\") pod \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " Oct 03 09:30:03 crc kubenswrapper[4790]: I1003 09:30:03.753078 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72133265-7f2f-4394-b2a6-79ea86c4d8d2-secret-volume\") pod \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " Oct 03 09:30:03 crc kubenswrapper[4790]: I1003 09:30:03.753182 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72133265-7f2f-4394-b2a6-79ea86c4d8d2-config-volume\") pod \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\" (UID: \"72133265-7f2f-4394-b2a6-79ea86c4d8d2\") " Oct 03 09:30:03 crc kubenswrapper[4790]: I1003 09:30:03.753758 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72133265-7f2f-4394-b2a6-79ea86c4d8d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "72133265-7f2f-4394-b2a6-79ea86c4d8d2" (UID: "72133265-7f2f-4394-b2a6-79ea86c4d8d2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:30:03 crc kubenswrapper[4790]: I1003 09:30:03.758745 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72133265-7f2f-4394-b2a6-79ea86c4d8d2-kube-api-access-j6l75" (OuterVolumeSpecName: "kube-api-access-j6l75") pod "72133265-7f2f-4394-b2a6-79ea86c4d8d2" (UID: "72133265-7f2f-4394-b2a6-79ea86c4d8d2"). 
InnerVolumeSpecName "kube-api-access-j6l75". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:30:03 crc kubenswrapper[4790]: I1003 09:30:03.758867 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72133265-7f2f-4394-b2a6-79ea86c4d8d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "72133265-7f2f-4394-b2a6-79ea86c4d8d2" (UID: "72133265-7f2f-4394-b2a6-79ea86c4d8d2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:30:03 crc kubenswrapper[4790]: I1003 09:30:03.856528 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72133265-7f2f-4394-b2a6-79ea86c4d8d2-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:30:03 crc kubenswrapper[4790]: I1003 09:30:03.856598 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6l75\" (UniqueName: \"kubernetes.io/projected/72133265-7f2f-4394-b2a6-79ea86c4d8d2-kube-api-access-j6l75\") on node \"crc\" DevicePath \"\"" Oct 03 09:30:03 crc kubenswrapper[4790]: I1003 09:30:03.856624 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/72133265-7f2f-4394-b2a6-79ea86c4d8d2-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:30:04 crc kubenswrapper[4790]: I1003 09:30:04.248358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" event={"ID":"72133265-7f2f-4394-b2a6-79ea86c4d8d2","Type":"ContainerDied","Data":"7f1f567d0f384b0dc4a79e7a7edee41f4755080fa010dffb2e13ecb330f71685"} Oct 03 09:30:04 crc kubenswrapper[4790]: I1003 09:30:04.248407 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f1f567d0f384b0dc4a79e7a7edee41f4755080fa010dffb2e13ecb330f71685" Oct 03 09:30:04 crc kubenswrapper[4790]: I1003 09:30:04.248462 4790 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh" Oct 03 09:30:04 crc kubenswrapper[4790]: I1003 09:30:04.324486 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd"] Oct 03 09:30:04 crc kubenswrapper[4790]: I1003 09:30:04.341280 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324685-lhpqd"] Oct 03 09:30:06 crc kubenswrapper[4790]: I1003 09:30:06.345649 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52625f75-0999-4e17-a847-d2227effc006" path="/var/lib/kubelet/pods/52625f75-0999-4e17-a847-d2227effc006/volumes" Oct 03 09:30:14 crc kubenswrapper[4790]: I1003 09:30:14.338954 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:30:14 crc kubenswrapper[4790]: E1003 09:30:14.340170 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:30:25 crc kubenswrapper[4790]: I1003 09:30:25.329547 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:30:25 crc kubenswrapper[4790]: E1003 09:30:25.330506 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:30:29 crc kubenswrapper[4790]: I1003 09:30:29.266870 4790 scope.go:117] "RemoveContainer" containerID="ab6c5bc82cfc5d6f91e6562d40f19a146cb0220ddddb4b39730c66593c86b2a4" Oct 03 09:30:40 crc kubenswrapper[4790]: I1003 09:30:40.330796 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:30:40 crc kubenswrapper[4790]: E1003 09:30:40.331379 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:30:53 crc kubenswrapper[4790]: I1003 09:30:53.328742 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:30:53 crc kubenswrapper[4790]: E1003 09:30:53.330072 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:31:04 crc kubenswrapper[4790]: I1003 09:31:04.341633 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:31:04 crc kubenswrapper[4790]: E1003 09:31:04.342547 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:31:19 crc kubenswrapper[4790]: I1003 09:31:19.328977 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:31:19 crc kubenswrapper[4790]: E1003 09:31:19.330070 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:31:30 crc kubenswrapper[4790]: I1003 09:31:30.328769 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:31:30 crc kubenswrapper[4790]: E1003 09:31:30.329846 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:31:40 crc kubenswrapper[4790]: I1003 09:31:40.821803 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xsqlg"] Oct 03 09:31:40 crc kubenswrapper[4790]: E1003 09:31:40.822992 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72133265-7f2f-4394-b2a6-79ea86c4d8d2" 
containerName="collect-profiles" Oct 03 09:31:40 crc kubenswrapper[4790]: I1003 09:31:40.823009 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="72133265-7f2f-4394-b2a6-79ea86c4d8d2" containerName="collect-profiles" Oct 03 09:31:40 crc kubenswrapper[4790]: I1003 09:31:40.823247 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="72133265-7f2f-4394-b2a6-79ea86c4d8d2" containerName="collect-profiles" Oct 03 09:31:40 crc kubenswrapper[4790]: I1003 09:31:40.825002 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:40 crc kubenswrapper[4790]: I1003 09:31:40.855167 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsqlg"] Oct 03 09:31:40 crc kubenswrapper[4790]: I1003 09:31:40.929855 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-utilities\") pod \"redhat-marketplace-xsqlg\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:40 crc kubenswrapper[4790]: I1003 09:31:40.929952 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkvxk\" (UniqueName: \"kubernetes.io/projected/523de508-11df-4dd9-9244-d13ea2181872-kube-api-access-fkvxk\") pod \"redhat-marketplace-xsqlg\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:40 crc kubenswrapper[4790]: I1003 09:31:40.930104 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-catalog-content\") pod \"redhat-marketplace-xsqlg\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " 
pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:41 crc kubenswrapper[4790]: I1003 09:31:41.035153 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-utilities\") pod \"redhat-marketplace-xsqlg\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:41 crc kubenswrapper[4790]: I1003 09:31:41.035263 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkvxk\" (UniqueName: \"kubernetes.io/projected/523de508-11df-4dd9-9244-d13ea2181872-kube-api-access-fkvxk\") pod \"redhat-marketplace-xsqlg\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:41 crc kubenswrapper[4790]: I1003 09:31:41.035420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-catalog-content\") pod \"redhat-marketplace-xsqlg\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:41 crc kubenswrapper[4790]: I1003 09:31:41.035670 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-utilities\") pod \"redhat-marketplace-xsqlg\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:41 crc kubenswrapper[4790]: I1003 09:31:41.041212 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-catalog-content\") pod \"redhat-marketplace-xsqlg\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " pod="openshift-marketplace/redhat-marketplace-xsqlg" 
Oct 03 09:31:41 crc kubenswrapper[4790]: I1003 09:31:41.062823 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkvxk\" (UniqueName: \"kubernetes.io/projected/523de508-11df-4dd9-9244-d13ea2181872-kube-api-access-fkvxk\") pod \"redhat-marketplace-xsqlg\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:41 crc kubenswrapper[4790]: I1003 09:31:41.152597 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:41 crc kubenswrapper[4790]: I1003 09:31:41.632783 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsqlg"] Oct 03 09:31:42 crc kubenswrapper[4790]: I1003 09:31:42.305484 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsqlg" event={"ID":"523de508-11df-4dd9-9244-d13ea2181872","Type":"ContainerStarted","Data":"891731d6dfe0c589c888bf65b8b6b4fe0bd48e38069fea23b0b0528fd74b7703"} Oct 03 09:31:43 crc kubenswrapper[4790]: I1003 09:31:43.316332 4790 generic.go:334] "Generic (PLEG): container finished" podID="523de508-11df-4dd9-9244-d13ea2181872" containerID="d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc" exitCode=0 Oct 03 09:31:43 crc kubenswrapper[4790]: I1003 09:31:43.316377 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsqlg" event={"ID":"523de508-11df-4dd9-9244-d13ea2181872","Type":"ContainerDied","Data":"d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc"} Oct 03 09:31:43 crc kubenswrapper[4790]: I1003 09:31:43.329058 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:31:43 crc kubenswrapper[4790]: E1003 09:31:43.329319 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:31:46 crc kubenswrapper[4790]: I1003 09:31:46.364182 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsqlg" event={"ID":"523de508-11df-4dd9-9244-d13ea2181872","Type":"ContainerStarted","Data":"4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4"} Oct 03 09:31:47 crc kubenswrapper[4790]: I1003 09:31:47.390393 4790 generic.go:334] "Generic (PLEG): container finished" podID="523de508-11df-4dd9-9244-d13ea2181872" containerID="4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4" exitCode=0 Oct 03 09:31:47 crc kubenswrapper[4790]: I1003 09:31:47.390441 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsqlg" event={"ID":"523de508-11df-4dd9-9244-d13ea2181872","Type":"ContainerDied","Data":"4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4"} Oct 03 09:31:50 crc kubenswrapper[4790]: I1003 09:31:50.428262 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsqlg" event={"ID":"523de508-11df-4dd9-9244-d13ea2181872","Type":"ContainerStarted","Data":"8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b"} Oct 03 09:31:50 crc kubenswrapper[4790]: I1003 09:31:50.459456 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xsqlg" podStartSLOduration=3.815769317 podStartE2EDuration="10.459428993s" podCreationTimestamp="2025-10-03 09:31:40 +0000 UTC" firstStartedPulling="2025-10-03 09:31:43.318194858 +0000 UTC m=+3089.671129106" lastFinishedPulling="2025-10-03 09:31:49.961854544 
+0000 UTC m=+3096.314788782" observedRunningTime="2025-10-03 09:31:50.449039097 +0000 UTC m=+3096.801973335" watchObservedRunningTime="2025-10-03 09:31:50.459428993 +0000 UTC m=+3096.812363241" Oct 03 09:31:51 crc kubenswrapper[4790]: I1003 09:31:51.152747 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:51 crc kubenswrapper[4790]: I1003 09:31:51.152893 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:31:52 crc kubenswrapper[4790]: I1003 09:31:52.204713 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-xsqlg" podUID="523de508-11df-4dd9-9244-d13ea2181872" containerName="registry-server" probeResult="failure" output=< Oct 03 09:31:52 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 09:31:52 crc kubenswrapper[4790]: > Oct 03 09:31:56 crc kubenswrapper[4790]: I1003 09:31:56.329257 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:31:56 crc kubenswrapper[4790]: E1003 09:31:56.342198 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:32:01 crc kubenswrapper[4790]: I1003 09:32:01.197743 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:32:01 crc kubenswrapper[4790]: I1003 09:32:01.276353 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:32:01 crc kubenswrapper[4790]: I1003 09:32:01.457958 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsqlg"] Oct 03 09:32:02 crc kubenswrapper[4790]: I1003 09:32:02.572827 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xsqlg" podUID="523de508-11df-4dd9-9244-d13ea2181872" containerName="registry-server" containerID="cri-o://8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b" gracePeriod=2 Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.201448 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.361811 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-catalog-content\") pod \"523de508-11df-4dd9-9244-d13ea2181872\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.362052 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkvxk\" (UniqueName: \"kubernetes.io/projected/523de508-11df-4dd9-9244-d13ea2181872-kube-api-access-fkvxk\") pod \"523de508-11df-4dd9-9244-d13ea2181872\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.362099 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-utilities\") pod \"523de508-11df-4dd9-9244-d13ea2181872\" (UID: \"523de508-11df-4dd9-9244-d13ea2181872\") " Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.363851 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-utilities" (OuterVolumeSpecName: "utilities") pod "523de508-11df-4dd9-9244-d13ea2181872" (UID: "523de508-11df-4dd9-9244-d13ea2181872"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.370915 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/523de508-11df-4dd9-9244-d13ea2181872-kube-api-access-fkvxk" (OuterVolumeSpecName: "kube-api-access-fkvxk") pod "523de508-11df-4dd9-9244-d13ea2181872" (UID: "523de508-11df-4dd9-9244-d13ea2181872"). InnerVolumeSpecName "kube-api-access-fkvxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.389490 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "523de508-11df-4dd9-9244-d13ea2181872" (UID: "523de508-11df-4dd9-9244-d13ea2181872"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.464825 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.464865 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkvxk\" (UniqueName: \"kubernetes.io/projected/523de508-11df-4dd9-9244-d13ea2181872-kube-api-access-fkvxk\") on node \"crc\" DevicePath \"\"" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.464878 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523de508-11df-4dd9-9244-d13ea2181872-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.583140 4790 generic.go:334] "Generic (PLEG): container finished" podID="523de508-11df-4dd9-9244-d13ea2181872" containerID="8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b" exitCode=0 Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.583186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsqlg" event={"ID":"523de508-11df-4dd9-9244-d13ea2181872","Type":"ContainerDied","Data":"8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b"} Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.583212 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xsqlg" event={"ID":"523de508-11df-4dd9-9244-d13ea2181872","Type":"ContainerDied","Data":"891731d6dfe0c589c888bf65b8b6b4fe0bd48e38069fea23b0b0528fd74b7703"} Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.583229 4790 scope.go:117] "RemoveContainer" containerID="8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 
09:32:03.583248 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xsqlg" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.620099 4790 scope.go:117] "RemoveContainer" containerID="4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.623330 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsqlg"] Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.641049 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xsqlg"] Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.650450 4790 scope.go:117] "RemoveContainer" containerID="d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.701661 4790 scope.go:117] "RemoveContainer" containerID="8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b" Oct 03 09:32:03 crc kubenswrapper[4790]: E1003 09:32:03.702478 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b\": container with ID starting with 8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b not found: ID does not exist" containerID="8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.702518 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b"} err="failed to get container status \"8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b\": rpc error: code = NotFound desc = could not find container \"8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b\": container with ID starting with 
8bed306148b867fdc9421c6826b90afe7a2fdb2a256b337ab0a52f16c3320c9b not found: ID does not exist" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.702547 4790 scope.go:117] "RemoveContainer" containerID="4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4" Oct 03 09:32:03 crc kubenswrapper[4790]: E1003 09:32:03.703002 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4\": container with ID starting with 4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4 not found: ID does not exist" containerID="4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.703042 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4"} err="failed to get container status \"4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4\": rpc error: code = NotFound desc = could not find container \"4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4\": container with ID starting with 4264cf396cb794d1d02951befdb13dca6a716b518d310bb985e852f3d8a5e1e4 not found: ID does not exist" Oct 03 09:32:03 crc kubenswrapper[4790]: I1003 09:32:03.703056 4790 scope.go:117] "RemoveContainer" containerID="d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc" Oct 03 09:32:03 crc kubenswrapper[4790]: E1003 09:32:03.703290 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc\": container with ID starting with d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc not found: ID does not exist" containerID="d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc" Oct 03 09:32:03 crc 
kubenswrapper[4790]: I1003 09:32:03.703312 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc"} err="failed to get container status \"d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc\": rpc error: code = NotFound desc = could not find container \"d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc\": container with ID starting with d6df54724106b996e6c9190b7d57245bc610c2d1d4f54c0d2ed9e5b6a1eb15bc not found: ID does not exist" Oct 03 09:32:04 crc kubenswrapper[4790]: I1003 09:32:04.346461 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="523de508-11df-4dd9-9244-d13ea2181872" path="/var/lib/kubelet/pods/523de508-11df-4dd9-9244-d13ea2181872/volumes" Oct 03 09:32:07 crc kubenswrapper[4790]: I1003 09:32:07.328347 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:32:07 crc kubenswrapper[4790]: E1003 09:32:07.329979 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:32:21 crc kubenswrapper[4790]: I1003 09:32:21.328515 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:32:21 crc kubenswrapper[4790]: E1003 09:32:21.329356 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:32:33 crc kubenswrapper[4790]: I1003 09:32:33.328066 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:32:33 crc kubenswrapper[4790]: E1003 09:32:33.353285 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:32:44 crc kubenswrapper[4790]: I1003 09:32:44.338544 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:32:44 crc kubenswrapper[4790]: E1003 09:32:44.339464 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:32:57 crc kubenswrapper[4790]: I1003 09:32:57.328113 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:32:57 crc kubenswrapper[4790]: E1003 09:32:57.329067 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:33:08 crc kubenswrapper[4790]: I1003 09:33:08.329228 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:33:08 crc kubenswrapper[4790]: E1003 09:33:08.330197 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:33:21 crc kubenswrapper[4790]: I1003 09:33:21.328401 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:33:21 crc kubenswrapper[4790]: E1003 09:33:21.329153 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:33:33 crc kubenswrapper[4790]: I1003 09:33:33.328856 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:33:33 crc kubenswrapper[4790]: E1003 09:33:33.329828 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:33:48 crc kubenswrapper[4790]: I1003 09:33:48.328601 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b" Oct 03 09:33:48 crc kubenswrapper[4790]: I1003 09:33:48.713126 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"bbe57b25c1d9005b99952abb43502fff1160bcf8d2ed6a764726a202017f31fe"} Oct 03 09:35:58 crc kubenswrapper[4790]: I1003 09:35:58.999158 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-44tz2"] Oct 03 09:35:59 crc kubenswrapper[4790]: E1003 09:35:58.999983 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523de508-11df-4dd9-9244-d13ea2181872" containerName="extract-utilities" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:58.999995 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="523de508-11df-4dd9-9244-d13ea2181872" containerName="extract-utilities" Oct 03 09:35:59 crc kubenswrapper[4790]: E1003 09:35:59.000032 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523de508-11df-4dd9-9244-d13ea2181872" containerName="extract-content" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.000038 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="523de508-11df-4dd9-9244-d13ea2181872" containerName="extract-content" Oct 03 09:35:59 crc kubenswrapper[4790]: E1003 09:35:59.000051 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523de508-11df-4dd9-9244-d13ea2181872" containerName="registry-server" Oct 03 09:35:59 crc 
kubenswrapper[4790]: I1003 09:35:59.000058 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="523de508-11df-4dd9-9244-d13ea2181872" containerName="registry-server" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.000242 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="523de508-11df-4dd9-9244-d13ea2181872" containerName="registry-server" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.001746 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.025659 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-44tz2"] Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.114962 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-catalog-content\") pod \"certified-operators-44tz2\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.115021 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvwf7\" (UniqueName: \"kubernetes.io/projected/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-kube-api-access-rvwf7\") pod \"certified-operators-44tz2\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.115114 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-utilities\") pod \"certified-operators-44tz2\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " pod="openshift-marketplace/certified-operators-44tz2" Oct 03 
09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.217507 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-catalog-content\") pod \"certified-operators-44tz2\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.217564 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvwf7\" (UniqueName: \"kubernetes.io/projected/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-kube-api-access-rvwf7\") pod \"certified-operators-44tz2\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.217673 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-utilities\") pod \"certified-operators-44tz2\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.218008 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-catalog-content\") pod \"certified-operators-44tz2\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.218040 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-utilities\") pod \"certified-operators-44tz2\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 
09:35:59.236942 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvwf7\" (UniqueName: \"kubernetes.io/projected/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-kube-api-access-rvwf7\") pod \"certified-operators-44tz2\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.324489 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:35:59 crc kubenswrapper[4790]: I1003 09:35:59.863626 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-44tz2"] Oct 03 09:36:00 crc kubenswrapper[4790]: I1003 09:36:00.085252 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44tz2" event={"ID":"388b15f9-ac41-4f3e-93ff-6d0a251a4a01","Type":"ContainerStarted","Data":"09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf"} Oct 03 09:36:00 crc kubenswrapper[4790]: I1003 09:36:00.085294 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44tz2" event={"ID":"388b15f9-ac41-4f3e-93ff-6d0a251a4a01","Type":"ContainerStarted","Data":"628a9a21acfd04051a4f5bb2589b6403b31ff5fbaec21c943520a585126e3064"} Oct 03 09:36:01 crc kubenswrapper[4790]: I1003 09:36:01.103035 4790 generic.go:334] "Generic (PLEG): container finished" podID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerID="09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf" exitCode=0 Oct 03 09:36:01 crc kubenswrapper[4790]: I1003 09:36:01.103279 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44tz2" event={"ID":"388b15f9-ac41-4f3e-93ff-6d0a251a4a01","Type":"ContainerDied","Data":"09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf"} Oct 03 09:36:01 crc kubenswrapper[4790]: I1003 
09:36:01.105977 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:36:02 crc kubenswrapper[4790]: I1003 09:36:02.120808 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44tz2" event={"ID":"388b15f9-ac41-4f3e-93ff-6d0a251a4a01","Type":"ContainerStarted","Data":"12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a"} Oct 03 09:36:04 crc kubenswrapper[4790]: I1003 09:36:04.149261 4790 generic.go:334] "Generic (PLEG): container finished" podID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerID="12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a" exitCode=0 Oct 03 09:36:04 crc kubenswrapper[4790]: I1003 09:36:04.149338 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44tz2" event={"ID":"388b15f9-ac41-4f3e-93ff-6d0a251a4a01","Type":"ContainerDied","Data":"12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a"} Oct 03 09:36:05 crc kubenswrapper[4790]: I1003 09:36:05.163157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44tz2" event={"ID":"388b15f9-ac41-4f3e-93ff-6d0a251a4a01","Type":"ContainerStarted","Data":"dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae"} Oct 03 09:36:05 crc kubenswrapper[4790]: I1003 09:36:05.192065 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-44tz2" podStartSLOduration=3.699242856 podStartE2EDuration="7.19204551s" podCreationTimestamp="2025-10-03 09:35:58 +0000 UTC" firstStartedPulling="2025-10-03 09:36:01.105625539 +0000 UTC m=+3347.458559797" lastFinishedPulling="2025-10-03 09:36:04.598428213 +0000 UTC m=+3350.951362451" observedRunningTime="2025-10-03 09:36:05.183609928 +0000 UTC m=+3351.536544186" watchObservedRunningTime="2025-10-03 09:36:05.19204551 +0000 UTC m=+3351.544979748" Oct 03 09:36:09 crc 
kubenswrapper[4790]: I1003 09:36:09.325377 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:36:09 crc kubenswrapper[4790]: I1003 09:36:09.326003 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:36:09 crc kubenswrapper[4790]: I1003 09:36:09.372517 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:36:10 crc kubenswrapper[4790]: I1003 09:36:10.186000 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:36:10 crc kubenswrapper[4790]: I1003 09:36:10.186314 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:36:10 crc kubenswrapper[4790]: I1003 09:36:10.267133 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:36:10 crc kubenswrapper[4790]: I1003 09:36:10.312322 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-44tz2"] Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.232279 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-44tz2" podUID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerName="registry-server" 
containerID="cri-o://dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae" gracePeriod=2 Oct 03 09:36:12 crc kubenswrapper[4790]: E1003 09:36:12.381018 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod388b15f9_ac41_4f3e_93ff_6d0a251a4a01.slice/crio-dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod388b15f9_ac41_4f3e_93ff_6d0a251a4a01.slice/crio-conmon-dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae.scope\": RecentStats: unable to find data in memory cache]" Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.738294 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.838874 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvwf7\" (UniqueName: \"kubernetes.io/projected/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-kube-api-access-rvwf7\") pod \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.838965 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-utilities\") pod \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\" (UID: \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.839098 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-catalog-content\") pod \"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\" (UID: 
\"388b15f9-ac41-4f3e-93ff-6d0a251a4a01\") " Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.840022 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-utilities" (OuterVolumeSpecName: "utilities") pod "388b15f9-ac41-4f3e-93ff-6d0a251a4a01" (UID: "388b15f9-ac41-4f3e-93ff-6d0a251a4a01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.846224 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-kube-api-access-rvwf7" (OuterVolumeSpecName: "kube-api-access-rvwf7") pod "388b15f9-ac41-4f3e-93ff-6d0a251a4a01" (UID: "388b15f9-ac41-4f3e-93ff-6d0a251a4a01"). InnerVolumeSpecName "kube-api-access-rvwf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.880087 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "388b15f9-ac41-4f3e-93ff-6d0a251a4a01" (UID: "388b15f9-ac41-4f3e-93ff-6d0a251a4a01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.941159 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.941198 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:36:12 crc kubenswrapper[4790]: I1003 09:36:12.941213 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvwf7\" (UniqueName: \"kubernetes.io/projected/388b15f9-ac41-4f3e-93ff-6d0a251a4a01-kube-api-access-rvwf7\") on node \"crc\" DevicePath \"\"" Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.243479 4790 generic.go:334] "Generic (PLEG): container finished" podID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerID="dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae" exitCode=0 Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.243531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44tz2" event={"ID":"388b15f9-ac41-4f3e-93ff-6d0a251a4a01","Type":"ContainerDied","Data":"dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae"} Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.243607 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44tz2" event={"ID":"388b15f9-ac41-4f3e-93ff-6d0a251a4a01","Type":"ContainerDied","Data":"628a9a21acfd04051a4f5bb2589b6403b31ff5fbaec21c943520a585126e3064"} Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.243631 4790 scope.go:117] "RemoveContainer" containerID="dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae" Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 
09:36:13.243545 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44tz2" Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.270612 4790 scope.go:117] "RemoveContainer" containerID="12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a" Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.288789 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-44tz2"] Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.297860 4790 scope.go:117] "RemoveContainer" containerID="09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf" Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.298547 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-44tz2"] Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.344983 4790 scope.go:117] "RemoveContainer" containerID="dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae" Oct 03 09:36:13 crc kubenswrapper[4790]: E1003 09:36:13.347147 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae\": container with ID starting with dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae not found: ID does not exist" containerID="dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae" Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.347198 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae"} err="failed to get container status \"dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae\": rpc error: code = NotFound desc = could not find container \"dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae\": container with ID starting with 
dc79b6c052165faf8fe4e961facb49e886fc5bda98725cc9e4b025e74e5855ae not found: ID does not exist" Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.347233 4790 scope.go:117] "RemoveContainer" containerID="12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a" Oct 03 09:36:13 crc kubenswrapper[4790]: E1003 09:36:13.349091 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a\": container with ID starting with 12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a not found: ID does not exist" containerID="12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a" Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.349130 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a"} err="failed to get container status \"12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a\": rpc error: code = NotFound desc = could not find container \"12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a\": container with ID starting with 12e5877edf84d9653b96ebc9b1f21f55d5372f9775d0a0428cd09eab6fa2317a not found: ID does not exist" Oct 03 09:36:13 crc kubenswrapper[4790]: I1003 09:36:13.349149 4790 scope.go:117] "RemoveContainer" containerID="09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf" Oct 03 09:36:13 crc kubenswrapper[4790]: E1003 09:36:13.349741 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf\": container with ID starting with 09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf not found: ID does not exist" containerID="09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf" Oct 03 09:36:13 crc 
kubenswrapper[4790]: I1003 09:36:13.349786 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf"} err="failed to get container status \"09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf\": rpc error: code = NotFound desc = could not find container \"09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf\": container with ID starting with 09cc052cc4eeaeb9fa5615e5905dba821e846c03ad97983ff7bf65504ed2ecbf not found: ID does not exist" Oct 03 09:36:14 crc kubenswrapper[4790]: I1003 09:36:14.347066 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" path="/var/lib/kubelet/pods/388b15f9-ac41-4f3e-93ff-6d0a251a4a01/volumes" Oct 03 09:36:40 crc kubenswrapper[4790]: I1003 09:36:40.186038 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:36:40 crc kubenswrapper[4790]: I1003 09:36:40.186746 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:37:10 crc kubenswrapper[4790]: I1003 09:37:10.185698 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:37:10 crc kubenswrapper[4790]: I1003 09:37:10.186423 4790 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:37:10 crc kubenswrapper[4790]: I1003 09:37:10.186485 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 09:37:10 crc kubenswrapper[4790]: I1003 09:37:10.187668 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bbe57b25c1d9005b99952abb43502fff1160bcf8d2ed6a764726a202017f31fe"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:37:10 crc kubenswrapper[4790]: I1003 09:37:10.187768 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://bbe57b25c1d9005b99952abb43502fff1160bcf8d2ed6a764726a202017f31fe" gracePeriod=600 Oct 03 09:37:10 crc kubenswrapper[4790]: I1003 09:37:10.875355 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="bbe57b25c1d9005b99952abb43502fff1160bcf8d2ed6a764726a202017f31fe" exitCode=0 Oct 03 09:37:10 crc kubenswrapper[4790]: I1003 09:37:10.875492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"bbe57b25c1d9005b99952abb43502fff1160bcf8d2ed6a764726a202017f31fe"} Oct 03 09:37:10 crc kubenswrapper[4790]: I1003 
09:37:10.876021 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a"}
Oct 03 09:37:10 crc kubenswrapper[4790]: I1003 09:37:10.876053 4790 scope.go:117] "RemoveContainer" containerID="6ecb310a15330a3dcf7170c7134b2b3efb7d1ab4c378f8b09d033752d3cbd92b"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.033127 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kwmdc"]
Oct 03 09:37:35 crc kubenswrapper[4790]: E1003 09:37:35.034463 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerName="registry-server"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.034490 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerName="registry-server"
Oct 03 09:37:35 crc kubenswrapper[4790]: E1003 09:37:35.034527 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerName="extract-utilities"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.034542 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerName="extract-utilities"
Oct 03 09:37:35 crc kubenswrapper[4790]: E1003 09:37:35.034602 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerName="extract-content"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.034619 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerName="extract-content"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.035065 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="388b15f9-ac41-4f3e-93ff-6d0a251a4a01" containerName="registry-server"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.037869 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.043962 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kwmdc"]
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.135517 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-utilities\") pod \"community-operators-kwmdc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") " pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.135938 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-catalog-content\") pod \"community-operators-kwmdc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") " pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.135994 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntx9n\" (UniqueName: \"kubernetes.io/projected/d3f099ac-7a70-434b-887f-34acdc8fecbc-kube-api-access-ntx9n\") pod \"community-operators-kwmdc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") " pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.237992 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-utilities\") pod \"community-operators-kwmdc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") " pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.238087 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-catalog-content\") pod \"community-operators-kwmdc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") " pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.238148 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntx9n\" (UniqueName: \"kubernetes.io/projected/d3f099ac-7a70-434b-887f-34acdc8fecbc-kube-api-access-ntx9n\") pod \"community-operators-kwmdc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") " pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.238495 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-utilities\") pod \"community-operators-kwmdc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") " pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.238529 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-catalog-content\") pod \"community-operators-kwmdc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") " pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.259539 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntx9n\" (UniqueName: \"kubernetes.io/projected/d3f099ac-7a70-434b-887f-34acdc8fecbc-kube-api-access-ntx9n\") pod \"community-operators-kwmdc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") " pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.362714 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:35 crc kubenswrapper[4790]: I1003 09:37:35.840039 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kwmdc"]
Oct 03 09:37:36 crc kubenswrapper[4790]: I1003 09:37:36.155218 4790 generic.go:334] "Generic (PLEG): container finished" podID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerID="b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b" exitCode=0
Oct 03 09:37:36 crc kubenswrapper[4790]: I1003 09:37:36.155289 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwmdc" event={"ID":"d3f099ac-7a70-434b-887f-34acdc8fecbc","Type":"ContainerDied","Data":"b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b"}
Oct 03 09:37:36 crc kubenswrapper[4790]: I1003 09:37:36.155518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwmdc" event={"ID":"d3f099ac-7a70-434b-887f-34acdc8fecbc","Type":"ContainerStarted","Data":"c6c93e4fb02b9041eaead493575710dc0f340afcc31727d677028b77ba5f6154"}
Oct 03 09:37:37 crc kubenswrapper[4790]: I1003 09:37:37.170159 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwmdc" event={"ID":"d3f099ac-7a70-434b-887f-34acdc8fecbc","Type":"ContainerStarted","Data":"c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1"}
Oct 03 09:37:39 crc kubenswrapper[4790]: I1003 09:37:39.194653 4790 generic.go:334] "Generic (PLEG): container finished" podID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerID="c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1" exitCode=0
Oct 03 09:37:39 crc kubenswrapper[4790]: I1003 09:37:39.194782 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwmdc" event={"ID":"d3f099ac-7a70-434b-887f-34acdc8fecbc","Type":"ContainerDied","Data":"c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1"}
Oct 03 09:37:40 crc kubenswrapper[4790]: I1003 09:37:40.209159 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwmdc" event={"ID":"d3f099ac-7a70-434b-887f-34acdc8fecbc","Type":"ContainerStarted","Data":"9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9"}
Oct 03 09:37:40 crc kubenswrapper[4790]: I1003 09:37:40.232262 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kwmdc" podStartSLOduration=1.797959235 podStartE2EDuration="5.232237303s" podCreationTimestamp="2025-10-03 09:37:35 +0000 UTC" firstStartedPulling="2025-10-03 09:37:36.156803977 +0000 UTC m=+3442.509738215" lastFinishedPulling="2025-10-03 09:37:39.591082035 +0000 UTC m=+3445.944016283" observedRunningTime="2025-10-03 09:37:40.229979181 +0000 UTC m=+3446.582913429" watchObservedRunningTime="2025-10-03 09:37:40.232237303 +0000 UTC m=+3446.585171551"
Oct 03 09:37:45 crc kubenswrapper[4790]: I1003 09:37:45.363123 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:45 crc kubenswrapper[4790]: I1003 09:37:45.363726 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:45 crc kubenswrapper[4790]: I1003 09:37:45.422208 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:46 crc kubenswrapper[4790]: I1003 09:37:46.318329 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:46 crc kubenswrapper[4790]: I1003 09:37:46.387735 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kwmdc"]
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.297308 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kwmdc" podUID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerName="registry-server" containerID="cri-o://9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9" gracePeriod=2
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.780051 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.866495 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-catalog-content\") pod \"d3f099ac-7a70-434b-887f-34acdc8fecbc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") "
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.866539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-utilities\") pod \"d3f099ac-7a70-434b-887f-34acdc8fecbc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") "
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.866822 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntx9n\" (UniqueName: \"kubernetes.io/projected/d3f099ac-7a70-434b-887f-34acdc8fecbc-kube-api-access-ntx9n\") pod \"d3f099ac-7a70-434b-887f-34acdc8fecbc\" (UID: \"d3f099ac-7a70-434b-887f-34acdc8fecbc\") "
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.869002 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-utilities" (OuterVolumeSpecName: "utilities") pod "d3f099ac-7a70-434b-887f-34acdc8fecbc" (UID: "d3f099ac-7a70-434b-887f-34acdc8fecbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.875568 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f099ac-7a70-434b-887f-34acdc8fecbc-kube-api-access-ntx9n" (OuterVolumeSpecName: "kube-api-access-ntx9n") pod "d3f099ac-7a70-434b-887f-34acdc8fecbc" (UID: "d3f099ac-7a70-434b-887f-34acdc8fecbc"). InnerVolumeSpecName "kube-api-access-ntx9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.945696 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3f099ac-7a70-434b-887f-34acdc8fecbc" (UID: "d3f099ac-7a70-434b-887f-34acdc8fecbc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.969853 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntx9n\" (UniqueName: \"kubernetes.io/projected/d3f099ac-7a70-434b-887f-34acdc8fecbc-kube-api-access-ntx9n\") on node \"crc\" DevicePath \"\""
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.969897 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 09:37:48 crc kubenswrapper[4790]: I1003 09:37:48.969910 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3f099ac-7a70-434b-887f-34acdc8fecbc-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.309315 4790 generic.go:334] "Generic (PLEG): container finished" podID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerID="9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9" exitCode=0
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.309359 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwmdc" event={"ID":"d3f099ac-7a70-434b-887f-34acdc8fecbc","Type":"ContainerDied","Data":"9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9"}
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.309384 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kwmdc" event={"ID":"d3f099ac-7a70-434b-887f-34acdc8fecbc","Type":"ContainerDied","Data":"c6c93e4fb02b9041eaead493575710dc0f340afcc31727d677028b77ba5f6154"}
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.309399 4790 scope.go:117] "RemoveContainer" containerID="9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9"
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.309541 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kwmdc"
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.335853 4790 scope.go:117] "RemoveContainer" containerID="c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1"
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.347679 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kwmdc"]
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.357647 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kwmdc"]
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.365104 4790 scope.go:117] "RemoveContainer" containerID="b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b"
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.431719 4790 scope.go:117] "RemoveContainer" containerID="9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9"
Oct 03 09:37:49 crc kubenswrapper[4790]: E1003 09:37:49.432275 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9\": container with ID starting with 9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9 not found: ID does not exist" containerID="9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9"
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.432314 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9"} err="failed to get container status \"9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9\": rpc error: code = NotFound desc = could not find container \"9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9\": container with ID starting with 9e1c539885421b519a03388aff88a60b55a4c55ace0bceadd9a3d4d5d1ed09f9 not found: ID does not exist"
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.432340 4790 scope.go:117] "RemoveContainer" containerID="c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1"
Oct 03 09:37:49 crc kubenswrapper[4790]: E1003 09:37:49.432609 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1\": container with ID starting with c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1 not found: ID does not exist" containerID="c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1"
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.432634 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1"} err="failed to get container status \"c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1\": rpc error: code = NotFound desc = could not find container \"c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1\": container with ID starting with c6c408ce43165c13b522327b8da830938782eea80bdd1a08d917b1f55eb8faa1 not found: ID does not exist"
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.432657 4790 scope.go:117] "RemoveContainer" containerID="b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b"
Oct 03 09:37:49 crc kubenswrapper[4790]: E1003 09:37:49.433182 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b\": container with ID starting with b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b not found: ID does not exist" containerID="b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b"
Oct 03 09:37:49 crc kubenswrapper[4790]: I1003 09:37:49.433214 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b"} err="failed to get container status \"b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b\": rpc error: code = NotFound desc = could not find container \"b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b\": container with ID starting with b4092dc1c472f898c89dd98ffdf6fb45866c5318c92b5598a0a4664fbe51366b not found: ID does not exist"
Oct 03 09:37:50 crc kubenswrapper[4790]: I1003 09:37:50.339377 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f099ac-7a70-434b-887f-34acdc8fecbc" path="/var/lib/kubelet/pods/d3f099ac-7a70-434b-887f-34acdc8fecbc/volumes"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.613258 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k8cqf"]
Oct 03 09:38:40 crc kubenswrapper[4790]: E1003 09:38:40.614499 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerName="extract-utilities"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.614515 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerName="extract-utilities"
Oct 03 09:38:40 crc kubenswrapper[4790]: E1003 09:38:40.614535 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerName="registry-server"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.614544 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerName="registry-server"
Oct 03 09:38:40 crc kubenswrapper[4790]: E1003 09:38:40.614584 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerName="extract-content"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.614683 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerName="extract-content"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.614980 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3f099ac-7a70-434b-887f-34acdc8fecbc" containerName="registry-server"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.616973 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.637455 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k8cqf"]
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.657561 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rm2k\" (UniqueName: \"kubernetes.io/projected/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-kube-api-access-2rm2k\") pod \"redhat-operators-k8cqf\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") " pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.657682 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-utilities\") pod \"redhat-operators-k8cqf\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") " pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.657767 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-catalog-content\") pod \"redhat-operators-k8cqf\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") " pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.759218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-catalog-content\") pod \"redhat-operators-k8cqf\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") " pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.759362 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rm2k\" (UniqueName: \"kubernetes.io/projected/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-kube-api-access-2rm2k\") pod \"redhat-operators-k8cqf\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") " pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.759391 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-utilities\") pod \"redhat-operators-k8cqf\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") " pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.759970 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-utilities\") pod \"redhat-operators-k8cqf\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") " pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.760028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-catalog-content\") pod \"redhat-operators-k8cqf\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") " pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.789106 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rm2k\" (UniqueName: \"kubernetes.io/projected/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-kube-api-access-2rm2k\") pod \"redhat-operators-k8cqf\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") " pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:40 crc kubenswrapper[4790]: I1003 09:38:40.940758 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:41 crc kubenswrapper[4790]: I1003 09:38:41.920538 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k8cqf"]
Oct 03 09:38:42 crc kubenswrapper[4790]: I1003 09:38:42.905641 4790 generic.go:334] "Generic (PLEG): container finished" podID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerID="bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977" exitCode=0
Oct 03 09:38:42 crc kubenswrapper[4790]: I1003 09:38:42.905884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8cqf" event={"ID":"c9fede08-4252-4c7c-88fe-058a7c0ebcc2","Type":"ContainerDied","Data":"bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977"}
Oct 03 09:38:42 crc kubenswrapper[4790]: I1003 09:38:42.905971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8cqf" event={"ID":"c9fede08-4252-4c7c-88fe-058a7c0ebcc2","Type":"ContainerStarted","Data":"4cebafb208765a2f0ae2055b0032b777bbfb4b18ff705bbb5e854daae0d2c940"}
Oct 03 09:38:43 crc kubenswrapper[4790]: I1003 09:38:43.916567 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8cqf" event={"ID":"c9fede08-4252-4c7c-88fe-058a7c0ebcc2","Type":"ContainerStarted","Data":"8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1"}
Oct 03 09:38:48 crc kubenswrapper[4790]: I1003 09:38:48.963419 4790 generic.go:334] "Generic (PLEG): container finished" podID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerID="8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1" exitCode=0
Oct 03 09:38:48 crc kubenswrapper[4790]: I1003 09:38:48.963511 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8cqf" event={"ID":"c9fede08-4252-4c7c-88fe-058a7c0ebcc2","Type":"ContainerDied","Data":"8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1"}
Oct 03 09:38:49 crc kubenswrapper[4790]: I1003 09:38:49.974745 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8cqf" event={"ID":"c9fede08-4252-4c7c-88fe-058a7c0ebcc2","Type":"ContainerStarted","Data":"6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae"}
Oct 03 09:38:50 crc kubenswrapper[4790]: I1003 09:38:50.002374 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k8cqf" podStartSLOduration=3.32664074 podStartE2EDuration="10.00234556s" podCreationTimestamp="2025-10-03 09:38:40 +0000 UTC" firstStartedPulling="2025-10-03 09:38:42.911212976 +0000 UTC m=+3509.264147214" lastFinishedPulling="2025-10-03 09:38:49.586917796 +0000 UTC m=+3515.939852034" observedRunningTime="2025-10-03 09:38:49.994186855 +0000 UTC m=+3516.347121113" watchObservedRunningTime="2025-10-03 09:38:50.00234556 +0000 UTC m=+3516.355279808"
Oct 03 09:38:50 crc kubenswrapper[4790]: I1003 09:38:50.941385 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:50 crc kubenswrapper[4790]: I1003 09:38:50.941849 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:38:52 crc kubenswrapper[4790]: I1003 09:38:52.006427 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k8cqf" podUID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerName="registry-server" probeResult="failure" output=<
Oct 03 09:38:52 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s
Oct 03 09:38:52 crc kubenswrapper[4790]: >
Oct 03 09:39:01 crc kubenswrapper[4790]: I1003 09:39:01.006095 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:39:01 crc kubenswrapper[4790]: I1003 09:39:01.065271 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:39:01 crc kubenswrapper[4790]: I1003 09:39:01.247631 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k8cqf"]
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.093902 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k8cqf" podUID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerName="registry-server" containerID="cri-o://6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae" gracePeriod=2
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.592980 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.618343 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-utilities\") pod \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") "
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.618585 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rm2k\" (UniqueName: \"kubernetes.io/projected/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-kube-api-access-2rm2k\") pod \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") "
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.618657 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-catalog-content\") pod \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\" (UID: \"c9fede08-4252-4c7c-88fe-058a7c0ebcc2\") "
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.619327 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-utilities" (OuterVolumeSpecName: "utilities") pod "c9fede08-4252-4c7c-88fe-058a7c0ebcc2" (UID: "c9fede08-4252-4c7c-88fe-058a7c0ebcc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.630953 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-kube-api-access-2rm2k" (OuterVolumeSpecName: "kube-api-access-2rm2k") pod "c9fede08-4252-4c7c-88fe-058a7c0ebcc2" (UID: "c9fede08-4252-4c7c-88fe-058a7c0ebcc2"). InnerVolumeSpecName "kube-api-access-2rm2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.703303 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9fede08-4252-4c7c-88fe-058a7c0ebcc2" (UID: "c9fede08-4252-4c7c-88fe-058a7c0ebcc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.721263 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.721319 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rm2k\" (UniqueName: \"kubernetes.io/projected/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-kube-api-access-2rm2k\") on node \"crc\" DevicePath \"\""
Oct 03 09:39:02 crc kubenswrapper[4790]: I1003 09:39:02.721332 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9fede08-4252-4c7c-88fe-058a7c0ebcc2-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.106461 4790 generic.go:334] "Generic (PLEG): container finished" podID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerID="6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae" exitCode=0
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.106517 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8cqf" event={"ID":"c9fede08-4252-4c7c-88fe-058a7c0ebcc2","Type":"ContainerDied","Data":"6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae"}
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.106811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k8cqf" event={"ID":"c9fede08-4252-4c7c-88fe-058a7c0ebcc2","Type":"ContainerDied","Data":"4cebafb208765a2f0ae2055b0032b777bbfb4b18ff705bbb5e854daae0d2c940"}
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.106837 4790 scope.go:117] "RemoveContainer" containerID="6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae"
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.106528 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k8cqf"
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.133518 4790 scope.go:117] "RemoveContainer" containerID="8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1"
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.146169 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k8cqf"]
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.155682 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k8cqf"]
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.188069 4790 scope.go:117] "RemoveContainer" containerID="bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977"
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.235627 4790 scope.go:117] "RemoveContainer" containerID="6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae"
Oct 03 09:39:03 crc kubenswrapper[4790]: E1003 09:39:03.236561 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae\": container with ID starting with 6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae not found: ID does not exist" containerID="6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae"
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.236645 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae"} err="failed to get container status \"6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae\": rpc error: code = NotFound desc = could not find container \"6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae\": container with ID starting with 6aacdc348aa52458dce32ec0bf656bf4cc5fac594c287eb19c7a43ef4c1801ae not found: ID does not exist"
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.236675 4790 scope.go:117] "RemoveContainer" containerID="8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1"
Oct 03 09:39:03 crc kubenswrapper[4790]: E1003 09:39:03.237079 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1\": container with ID starting with 8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1 not found: ID does not exist" containerID="8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1"
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.237117 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1"} err="failed to get container status \"8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1\": rpc error: code = NotFound desc = could not find container \"8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1\": container with ID starting with 8f4681cdc08b64965c9a6bd105470f4284e9ee1f373703039f05e47ece53e1b1 not found: ID does not exist"
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.237143 4790 scope.go:117] "RemoveContainer" containerID="bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977"
Oct 03 09:39:03 crc kubenswrapper[4790]: E1003 09:39:03.237638 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977\": container with ID starting with bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977 not found: ID does not exist" containerID="bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977"
Oct 03 09:39:03 crc kubenswrapper[4790]: I1003 09:39:03.237707 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977"} err="failed to get container status \"bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977\": rpc error: code = NotFound desc = could not find container \"bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977\": container with ID starting with bb659b0a09f7041f771f7f2a7bccd947353923a6ebb58b9102b9b61b7510a977 not found: ID does not exist"
Oct 03 09:39:04 crc kubenswrapper[4790]: I1003 09:39:04.345906 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" path="/var/lib/kubelet/pods/c9fede08-4252-4c7c-88fe-058a7c0ebcc2/volumes"
Oct 03 09:39:10 crc kubenswrapper[4790]: I1003 09:39:10.185724 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 09:39:10 crc kubenswrapper[4790]: I1003 09:39:10.186032 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection
refused" Oct 03 09:39:40 crc kubenswrapper[4790]: I1003 09:39:40.188038 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:39:40 crc kubenswrapper[4790]: I1003 09:39:40.188823 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:40:10 crc kubenswrapper[4790]: I1003 09:40:10.185969 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:40:10 crc kubenswrapper[4790]: I1003 09:40:10.186504 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:40:10 crc kubenswrapper[4790]: I1003 09:40:10.186553 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 09:40:10 crc kubenswrapper[4790]: I1003 09:40:10.187387 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a"} 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:40:10 crc kubenswrapper[4790]: I1003 09:40:10.187440 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" gracePeriod=600 Oct 03 09:40:10 crc kubenswrapper[4790]: E1003 09:40:10.308904 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:40:10 crc kubenswrapper[4790]: I1003 09:40:10.853001 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" exitCode=0 Oct 03 09:40:10 crc kubenswrapper[4790]: I1003 09:40:10.853058 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a"} Oct 03 09:40:10 crc kubenswrapper[4790]: I1003 09:40:10.853668 4790 scope.go:117] "RemoveContainer" containerID="bbe57b25c1d9005b99952abb43502fff1160bcf8d2ed6a764726a202017f31fe" Oct 03 09:40:10 crc kubenswrapper[4790]: I1003 09:40:10.854562 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 
03 09:40:10 crc kubenswrapper[4790]: E1003 09:40:10.855053 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:40:19 crc kubenswrapper[4790]: E1003 09:40:19.040006 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Oct 03 09:40:25 crc kubenswrapper[4790]: I1003 09:40:25.328333 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:40:25 crc kubenswrapper[4790]: E1003 09:40:25.329138 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:40:39 crc kubenswrapper[4790]: I1003 09:40:39.329221 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:40:39 crc kubenswrapper[4790]: E1003 09:40:39.330059 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:40:51 crc kubenswrapper[4790]: I1003 09:40:51.329030 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:40:51 crc kubenswrapper[4790]: E1003 09:40:51.330226 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:41:02 crc kubenswrapper[4790]: I1003 09:41:02.328389 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:41:02 crc kubenswrapper[4790]: E1003 09:41:02.329261 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:41:17 crc kubenswrapper[4790]: I1003 09:41:17.328103 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:41:17 crc kubenswrapper[4790]: E1003 09:41:17.329191 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:41:32 crc kubenswrapper[4790]: I1003 09:41:32.328520 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:41:32 crc kubenswrapper[4790]: E1003 09:41:32.329681 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:41:47 crc kubenswrapper[4790]: I1003 09:41:47.329198 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:41:47 crc kubenswrapper[4790]: E1003 09:41:47.330105 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:42:02 crc kubenswrapper[4790]: I1003 09:42:02.340216 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:42:02 crc kubenswrapper[4790]: E1003 09:42:02.343396 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:42:16 crc kubenswrapper[4790]: I1003 09:42:16.328316 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:42:16 crc kubenswrapper[4790]: E1003 09:42:16.329209 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:42:31 crc kubenswrapper[4790]: I1003 09:42:31.328460 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:42:31 crc kubenswrapper[4790]: E1003 09:42:31.329316 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:42:43 crc kubenswrapper[4790]: I1003 09:42:43.328473 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:42:43 crc kubenswrapper[4790]: E1003 09:42:43.329508 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:42:56 crc kubenswrapper[4790]: I1003 09:42:56.329155 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:42:56 crc kubenswrapper[4790]: E1003 09:42:56.331178 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.265933 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x59hn"] Oct 03 09:42:58 crc kubenswrapper[4790]: E1003 09:42:58.266945 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerName="extract-content" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.266971 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerName="extract-content" Oct 03 09:42:58 crc kubenswrapper[4790]: E1003 09:42:58.267006 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerName="registry-server" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.267017 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerName="registry-server" Oct 03 09:42:58 crc kubenswrapper[4790]: E1003 09:42:58.267041 4790 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerName="extract-utilities" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.267053 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerName="extract-utilities" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.267620 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9fede08-4252-4c7c-88fe-058a7c0ebcc2" containerName="registry-server" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.281995 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.283643 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x59hn"] Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.370465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mprjk\" (UniqueName: \"kubernetes.io/projected/46d81759-761b-4aa1-aadb-3e1609205d2c-kube-api-access-mprjk\") pod \"redhat-marketplace-x59hn\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.370565 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-utilities\") pod \"redhat-marketplace-x59hn\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.370689 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-catalog-content\") pod 
\"redhat-marketplace-x59hn\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.472802 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mprjk\" (UniqueName: \"kubernetes.io/projected/46d81759-761b-4aa1-aadb-3e1609205d2c-kube-api-access-mprjk\") pod \"redhat-marketplace-x59hn\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.472906 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-utilities\") pod \"redhat-marketplace-x59hn\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.473091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-catalog-content\") pod \"redhat-marketplace-x59hn\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.475138 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-utilities\") pod \"redhat-marketplace-x59hn\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.475973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-catalog-content\") pod \"redhat-marketplace-x59hn\" (UID: 
\"46d81759-761b-4aa1-aadb-3e1609205d2c\") " pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.495461 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mprjk\" (UniqueName: \"kubernetes.io/projected/46d81759-761b-4aa1-aadb-3e1609205d2c-kube-api-access-mprjk\") pod \"redhat-marketplace-x59hn\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:58 crc kubenswrapper[4790]: I1003 09:42:58.608658 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:42:59 crc kubenswrapper[4790]: I1003 09:42:59.121945 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x59hn"] Oct 03 09:42:59 crc kubenswrapper[4790]: I1003 09:42:59.688671 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x59hn" event={"ID":"46d81759-761b-4aa1-aadb-3e1609205d2c","Type":"ContainerStarted","Data":"7778698e8980a0f9549fd720ad8ec6664da794524a98f070072a2ab705e7558f"} Oct 03 09:43:00 crc kubenswrapper[4790]: I1003 09:43:00.702453 4790 generic.go:334] "Generic (PLEG): container finished" podID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerID="2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb" exitCode=0 Oct 03 09:43:00 crc kubenswrapper[4790]: I1003 09:43:00.702630 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x59hn" event={"ID":"46d81759-761b-4aa1-aadb-3e1609205d2c","Type":"ContainerDied","Data":"2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb"} Oct 03 09:43:00 crc kubenswrapper[4790]: I1003 09:43:00.706214 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:43:01 crc kubenswrapper[4790]: I1003 09:43:01.715030 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x59hn" event={"ID":"46d81759-761b-4aa1-aadb-3e1609205d2c","Type":"ContainerStarted","Data":"32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf"} Oct 03 09:43:02 crc kubenswrapper[4790]: I1003 09:43:02.725049 4790 generic.go:334] "Generic (PLEG): container finished" podID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerID="32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf" exitCode=0 Oct 03 09:43:02 crc kubenswrapper[4790]: I1003 09:43:02.725089 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x59hn" event={"ID":"46d81759-761b-4aa1-aadb-3e1609205d2c","Type":"ContainerDied","Data":"32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf"} Oct 03 09:43:03 crc kubenswrapper[4790]: I1003 09:43:03.737960 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x59hn" event={"ID":"46d81759-761b-4aa1-aadb-3e1609205d2c","Type":"ContainerStarted","Data":"36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab"} Oct 03 09:43:03 crc kubenswrapper[4790]: I1003 09:43:03.765342 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x59hn" podStartSLOduration=3.124401402 podStartE2EDuration="5.76532356s" podCreationTimestamp="2025-10-03 09:42:58 +0000 UTC" firstStartedPulling="2025-10-03 09:43:00.705869168 +0000 UTC m=+3767.058803406" lastFinishedPulling="2025-10-03 09:43:03.346791316 +0000 UTC m=+3769.699725564" observedRunningTime="2025-10-03 09:43:03.761160955 +0000 UTC m=+3770.114095203" watchObservedRunningTime="2025-10-03 09:43:03.76532356 +0000 UTC m=+3770.118257798" Oct 03 09:43:07 crc kubenswrapper[4790]: I1003 09:43:07.328282 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:43:07 crc 
kubenswrapper[4790]: E1003 09:43:07.330226 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:43:08 crc kubenswrapper[4790]: I1003 09:43:08.609308 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:43:08 crc kubenswrapper[4790]: I1003 09:43:08.610275 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:43:08 crc kubenswrapper[4790]: I1003 09:43:08.661083 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:43:08 crc kubenswrapper[4790]: I1003 09:43:08.845845 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:43:08 crc kubenswrapper[4790]: I1003 09:43:08.902511 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x59hn"] Oct 03 09:43:10 crc kubenswrapper[4790]: I1003 09:43:10.813025 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x59hn" podUID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerName="registry-server" containerID="cri-o://36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab" gracePeriod=2 Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.318152 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.519918 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-catalog-content\") pod \"46d81759-761b-4aa1-aadb-3e1609205d2c\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.520274 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-utilities\") pod \"46d81759-761b-4aa1-aadb-3e1609205d2c\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.520430 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mprjk\" (UniqueName: \"kubernetes.io/projected/46d81759-761b-4aa1-aadb-3e1609205d2c-kube-api-access-mprjk\") pod \"46d81759-761b-4aa1-aadb-3e1609205d2c\" (UID: \"46d81759-761b-4aa1-aadb-3e1609205d2c\") " Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.521090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-utilities" (OuterVolumeSpecName: "utilities") pod "46d81759-761b-4aa1-aadb-3e1609205d2c" (UID: "46d81759-761b-4aa1-aadb-3e1609205d2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.526547 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d81759-761b-4aa1-aadb-3e1609205d2c-kube-api-access-mprjk" (OuterVolumeSpecName: "kube-api-access-mprjk") pod "46d81759-761b-4aa1-aadb-3e1609205d2c" (UID: "46d81759-761b-4aa1-aadb-3e1609205d2c"). InnerVolumeSpecName "kube-api-access-mprjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.535034 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46d81759-761b-4aa1-aadb-3e1609205d2c" (UID: "46d81759-761b-4aa1-aadb-3e1609205d2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.622950 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.622994 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d81759-761b-4aa1-aadb-3e1609205d2c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.623006 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mprjk\" (UniqueName: \"kubernetes.io/projected/46d81759-761b-4aa1-aadb-3e1609205d2c-kube-api-access-mprjk\") on node \"crc\" DevicePath \"\"" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.824294 4790 generic.go:334] "Generic (PLEG): container finished" podID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerID="36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab" exitCode=0 Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.824349 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x59hn" event={"ID":"46d81759-761b-4aa1-aadb-3e1609205d2c","Type":"ContainerDied","Data":"36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab"} Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.824424 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-x59hn" event={"ID":"46d81759-761b-4aa1-aadb-3e1609205d2c","Type":"ContainerDied","Data":"7778698e8980a0f9549fd720ad8ec6664da794524a98f070072a2ab705e7558f"} Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.824422 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x59hn" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.824463 4790 scope.go:117] "RemoveContainer" containerID="36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.858527 4790 scope.go:117] "RemoveContainer" containerID="32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.858843 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x59hn"] Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.870756 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x59hn"] Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.878988 4790 scope.go:117] "RemoveContainer" containerID="2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.927395 4790 scope.go:117] "RemoveContainer" containerID="36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab" Oct 03 09:43:11 crc kubenswrapper[4790]: E1003 09:43:11.927869 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab\": container with ID starting with 36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab not found: ID does not exist" containerID="36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.927903 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab"} err="failed to get container status \"36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab\": rpc error: code = NotFound desc = could not find container \"36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab\": container with ID starting with 36278e36c7087c8b111099185bd4fe465eda2a549dbeb56079df2a694ebac1ab not found: ID does not exist" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.927923 4790 scope.go:117] "RemoveContainer" containerID="32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf" Oct 03 09:43:11 crc kubenswrapper[4790]: E1003 09:43:11.931798 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf\": container with ID starting with 32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf not found: ID does not exist" containerID="32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.931825 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf"} err="failed to get container status \"32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf\": rpc error: code = NotFound desc = could not find container \"32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf\": container with ID starting with 32f42c90c92f2ef0a917b39bf62f0e76ee8c8999d36286a250b3254300089ecf not found: ID does not exist" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.931844 4790 scope.go:117] "RemoveContainer" containerID="2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb" Oct 03 09:43:11 crc kubenswrapper[4790]: E1003 
09:43:11.932118 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb\": container with ID starting with 2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb not found: ID does not exist" containerID="2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb" Oct 03 09:43:11 crc kubenswrapper[4790]: I1003 09:43:11.932139 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb"} err="failed to get container status \"2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb\": rpc error: code = NotFound desc = could not find container \"2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb\": container with ID starting with 2ea81b999fdc1a21b089a42ea0896c321f40861ca16abddf8ca2f539627242fb not found: ID does not exist" Oct 03 09:43:12 crc kubenswrapper[4790]: I1003 09:43:12.344659 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d81759-761b-4aa1-aadb-3e1609205d2c" path="/var/lib/kubelet/pods/46d81759-761b-4aa1-aadb-3e1609205d2c/volumes" Oct 03 09:43:18 crc kubenswrapper[4790]: I1003 09:43:18.328756 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:43:18 crc kubenswrapper[4790]: E1003 09:43:18.330589 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:43:31 crc kubenswrapper[4790]: I1003 09:43:31.329091 
4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:43:31 crc kubenswrapper[4790]: E1003 09:43:31.330653 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:43:43 crc kubenswrapper[4790]: I1003 09:43:43.328489 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:43:43 crc kubenswrapper[4790]: E1003 09:43:43.329389 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:43:54 crc kubenswrapper[4790]: I1003 09:43:54.341017 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:43:54 crc kubenswrapper[4790]: E1003 09:43:54.341771 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:44:08 crc kubenswrapper[4790]: I1003 
09:44:08.329463 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:44:08 crc kubenswrapper[4790]: E1003 09:44:08.330284 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:44:21 crc kubenswrapper[4790]: I1003 09:44:21.337441 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:44:21 crc kubenswrapper[4790]: E1003 09:44:21.338429 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:44:34 crc kubenswrapper[4790]: I1003 09:44:34.336705 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:44:34 crc kubenswrapper[4790]: E1003 09:44:34.337370 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:44:49 crc 
kubenswrapper[4790]: I1003 09:44:49.328921 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:44:49 crc kubenswrapper[4790]: E1003 09:44:49.329530 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.156221 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5"] Oct 03 09:45:00 crc kubenswrapper[4790]: E1003 09:45:00.157325 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerName="extract-utilities" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.157345 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerName="extract-utilities" Oct 03 09:45:00 crc kubenswrapper[4790]: E1003 09:45:00.157369 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerName="extract-content" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.157378 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerName="extract-content" Oct 03 09:45:00 crc kubenswrapper[4790]: E1003 09:45:00.157392 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerName="registry-server" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.157400 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerName="registry-server" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.157693 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d81759-761b-4aa1-aadb-3e1609205d2c" containerName="registry-server" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.158689 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.161596 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.173261 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5"] Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.174989 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.258629 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxdfv\" (UniqueName: \"kubernetes.io/projected/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-kube-api-access-vxdfv\") pod \"collect-profiles-29324745-lphg5\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.258705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-secret-volume\") pod \"collect-profiles-29324745-lphg5\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 
crc kubenswrapper[4790]: I1003 09:45:00.258783 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-config-volume\") pod \"collect-profiles-29324745-lphg5\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.362339 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxdfv\" (UniqueName: \"kubernetes.io/projected/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-kube-api-access-vxdfv\") pod \"collect-profiles-29324745-lphg5\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.362896 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-secret-volume\") pod \"collect-profiles-29324745-lphg5\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.363066 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-config-volume\") pod \"collect-profiles-29324745-lphg5\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.364064 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-config-volume\") pod \"collect-profiles-29324745-lphg5\" (UID: 
\"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.390745 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-secret-volume\") pod \"collect-profiles-29324745-lphg5\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.395520 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxdfv\" (UniqueName: \"kubernetes.io/projected/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-kube-api-access-vxdfv\") pod \"collect-profiles-29324745-lphg5\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.481007 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.926150 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5"] Oct 03 09:45:00 crc kubenswrapper[4790]: I1003 09:45:00.972541 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" event={"ID":"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90","Type":"ContainerStarted","Data":"8df02f14c27f272ffba8d402570ec5a9f324d3087486fa7481393a2994abf704"} Oct 03 09:45:01 crc kubenswrapper[4790]: I1003 09:45:01.328069 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:45:01 crc kubenswrapper[4790]: E1003 09:45:01.328340 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:45:01 crc kubenswrapper[4790]: I1003 09:45:01.988711 4790 generic.go:334] "Generic (PLEG): container finished" podID="5604f3b1-96b7-41fd-9fbf-8fadb45a1d90" containerID="8cd0bd8f10beed80a3b65ce20a11b384b651d96ccff7467acae0e96ab9068e82" exitCode=0 Oct 03 09:45:01 crc kubenswrapper[4790]: I1003 09:45:01.988809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" event={"ID":"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90","Type":"ContainerDied","Data":"8cd0bd8f10beed80a3b65ce20a11b384b651d96ccff7467acae0e96ab9068e82"} Oct 03 09:45:03 crc kubenswrapper[4790]: I1003 09:45:03.442354 4790 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:03 crc kubenswrapper[4790]: I1003 09:45:03.537600 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-config-volume\") pod \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " Oct 03 09:45:03 crc kubenswrapper[4790]: I1003 09:45:03.537780 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxdfv\" (UniqueName: \"kubernetes.io/projected/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-kube-api-access-vxdfv\") pod \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " Oct 03 09:45:03 crc kubenswrapper[4790]: I1003 09:45:03.537890 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-secret-volume\") pod \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\" (UID: \"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90\") " Oct 03 09:45:03 crc kubenswrapper[4790]: I1003 09:45:03.538664 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-config-volume" (OuterVolumeSpecName: "config-volume") pod "5604f3b1-96b7-41fd-9fbf-8fadb45a1d90" (UID: "5604f3b1-96b7-41fd-9fbf-8fadb45a1d90"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 09:45:03 crc kubenswrapper[4790]: I1003 09:45:03.547469 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-kube-api-access-vxdfv" (OuterVolumeSpecName: "kube-api-access-vxdfv") pod "5604f3b1-96b7-41fd-9fbf-8fadb45a1d90" (UID: "5604f3b1-96b7-41fd-9fbf-8fadb45a1d90"). 
InnerVolumeSpecName "kube-api-access-vxdfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:45:03 crc kubenswrapper[4790]: I1003 09:45:03.549775 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5604f3b1-96b7-41fd-9fbf-8fadb45a1d90" (UID: "5604f3b1-96b7-41fd-9fbf-8fadb45a1d90"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 09:45:03 crc kubenswrapper[4790]: I1003 09:45:03.641535 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:45:03 crc kubenswrapper[4790]: I1003 09:45:03.641891 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxdfv\" (UniqueName: \"kubernetes.io/projected/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-kube-api-access-vxdfv\") on node \"crc\" DevicePath \"\"" Oct 03 09:45:03 crc kubenswrapper[4790]: I1003 09:45:03.641953 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 09:45:04 crc kubenswrapper[4790]: I1003 09:45:04.009722 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" event={"ID":"5604f3b1-96b7-41fd-9fbf-8fadb45a1d90","Type":"ContainerDied","Data":"8df02f14c27f272ffba8d402570ec5a9f324d3087486fa7481393a2994abf704"} Oct 03 09:45:04 crc kubenswrapper[4790]: I1003 09:45:04.010022 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8df02f14c27f272ffba8d402570ec5a9f324d3087486fa7481393a2994abf704" Oct 03 09:45:04 crc kubenswrapper[4790]: I1003 09:45:04.009915 4790 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5" Oct 03 09:45:04 crc kubenswrapper[4790]: I1003 09:45:04.518811 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz"] Oct 03 09:45:04 crc kubenswrapper[4790]: I1003 09:45:04.527065 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324700-zkgcz"] Oct 03 09:45:06 crc kubenswrapper[4790]: I1003 09:45:06.343919 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fa19264-9605-4c16-a1d4-b412e599ebb1" path="/var/lib/kubelet/pods/5fa19264-9605-4c16-a1d4-b412e599ebb1/volumes" Oct 03 09:45:12 crc kubenswrapper[4790]: I1003 09:45:12.328367 4790 scope.go:117] "RemoveContainer" containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:45:13 crc kubenswrapper[4790]: I1003 09:45:13.103884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"d8e63f555b14da1aeb2f2ca4b01950ae4847d21aba56ef772f222042cdc7ddc1"} Oct 03 09:45:29 crc kubenswrapper[4790]: I1003 09:45:29.826890 4790 scope.go:117] "RemoveContainer" containerID="2de2f05a97ca023e000fd486bdb5e97319530dd3d0544ccc7b0ef8575bc70a81" Oct 03 09:46:08 crc kubenswrapper[4790]: I1003 09:46:08.862377 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h7bch"] Oct 03 09:46:08 crc kubenswrapper[4790]: E1003 09:46:08.863438 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5604f3b1-96b7-41fd-9fbf-8fadb45a1d90" containerName="collect-profiles" Oct 03 09:46:08 crc kubenswrapper[4790]: I1003 09:46:08.863453 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5604f3b1-96b7-41fd-9fbf-8fadb45a1d90" 
containerName="collect-profiles" Oct 03 09:46:08 crc kubenswrapper[4790]: I1003 09:46:08.863680 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5604f3b1-96b7-41fd-9fbf-8fadb45a1d90" containerName="collect-profiles" Oct 03 09:46:08 crc kubenswrapper[4790]: I1003 09:46:08.865696 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:08 crc kubenswrapper[4790]: I1003 09:46:08.878422 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h7bch"] Oct 03 09:46:08 crc kubenswrapper[4790]: I1003 09:46:08.960996 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-utilities\") pod \"certified-operators-h7bch\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:08 crc kubenswrapper[4790]: I1003 09:46:08.961515 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqvm4\" (UniqueName: \"kubernetes.io/projected/de744f23-2894-457e-b1a4-b5adbc2c047c-kube-api-access-sqvm4\") pod \"certified-operators-h7bch\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:08 crc kubenswrapper[4790]: I1003 09:46:08.961695 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-catalog-content\") pod \"certified-operators-h7bch\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:09 crc kubenswrapper[4790]: I1003 09:46:09.063290 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-utilities\") pod \"certified-operators-h7bch\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:09 crc kubenswrapper[4790]: I1003 09:46:09.063455 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqvm4\" (UniqueName: \"kubernetes.io/projected/de744f23-2894-457e-b1a4-b5adbc2c047c-kube-api-access-sqvm4\") pod \"certified-operators-h7bch\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:09 crc kubenswrapper[4790]: I1003 09:46:09.063498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-catalog-content\") pod \"certified-operators-h7bch\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:09 crc kubenswrapper[4790]: I1003 09:46:09.063801 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-utilities\") pod \"certified-operators-h7bch\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:09 crc kubenswrapper[4790]: I1003 09:46:09.064122 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-catalog-content\") pod \"certified-operators-h7bch\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:09 crc kubenswrapper[4790]: I1003 09:46:09.083657 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqvm4\" (UniqueName: 
\"kubernetes.io/projected/de744f23-2894-457e-b1a4-b5adbc2c047c-kube-api-access-sqvm4\") pod \"certified-operators-h7bch\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:09 crc kubenswrapper[4790]: I1003 09:46:09.183431 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:09 crc kubenswrapper[4790]: I1003 09:46:09.695320 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h7bch"] Oct 03 09:46:10 crc kubenswrapper[4790]: I1003 09:46:10.677323 4790 generic.go:334] "Generic (PLEG): container finished" podID="de744f23-2894-457e-b1a4-b5adbc2c047c" containerID="521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602" exitCode=0 Oct 03 09:46:10 crc kubenswrapper[4790]: I1003 09:46:10.677395 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bch" event={"ID":"de744f23-2894-457e-b1a4-b5adbc2c047c","Type":"ContainerDied","Data":"521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602"} Oct 03 09:46:10 crc kubenswrapper[4790]: I1003 09:46:10.678238 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bch" event={"ID":"de744f23-2894-457e-b1a4-b5adbc2c047c","Type":"ContainerStarted","Data":"b6acbde307800e087c445659b0a0018db6990b7bf781c7a76b7894a4af2eae30"} Oct 03 09:46:13 crc kubenswrapper[4790]: I1003 09:46:13.707119 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bch" event={"ID":"de744f23-2894-457e-b1a4-b5adbc2c047c","Type":"ContainerStarted","Data":"f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be"} Oct 03 09:46:20 crc kubenswrapper[4790]: I1003 09:46:20.808462 4790 generic.go:334] "Generic (PLEG): container finished" podID="de744f23-2894-457e-b1a4-b5adbc2c047c" 
containerID="f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be" exitCode=0 Oct 03 09:46:20 crc kubenswrapper[4790]: I1003 09:46:20.809318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bch" event={"ID":"de744f23-2894-457e-b1a4-b5adbc2c047c","Type":"ContainerDied","Data":"f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be"} Oct 03 09:46:21 crc kubenswrapper[4790]: I1003 09:46:21.823299 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bch" event={"ID":"de744f23-2894-457e-b1a4-b5adbc2c047c","Type":"ContainerStarted","Data":"224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e"} Oct 03 09:46:21 crc kubenswrapper[4790]: I1003 09:46:21.849333 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h7bch" podStartSLOduration=3.034572189 podStartE2EDuration="13.84930813s" podCreationTimestamp="2025-10-03 09:46:08 +0000 UTC" firstStartedPulling="2025-10-03 09:46:10.68114693 +0000 UTC m=+3957.034081178" lastFinishedPulling="2025-10-03 09:46:21.495882881 +0000 UTC m=+3967.848817119" observedRunningTime="2025-10-03 09:46:21.84202904 +0000 UTC m=+3968.194963288" watchObservedRunningTime="2025-10-03 09:46:21.84930813 +0000 UTC m=+3968.202242408" Oct 03 09:46:29 crc kubenswrapper[4790]: I1003 09:46:29.183927 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:29 crc kubenswrapper[4790]: I1003 09:46:29.184566 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:29 crc kubenswrapper[4790]: I1003 09:46:29.237647 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:29 crc kubenswrapper[4790]: I1003 09:46:29.967324 
4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:30 crc kubenswrapper[4790]: I1003 09:46:30.016021 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h7bch"] Oct 03 09:46:31 crc kubenswrapper[4790]: I1003 09:46:31.934712 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h7bch" podUID="de744f23-2894-457e-b1a4-b5adbc2c047c" containerName="registry-server" containerID="cri-o://224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e" gracePeriod=2 Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.416902 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.518812 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-catalog-content\") pod \"de744f23-2894-457e-b1a4-b5adbc2c047c\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.519250 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-utilities\") pod \"de744f23-2894-457e-b1a4-b5adbc2c047c\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.519340 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqvm4\" (UniqueName: \"kubernetes.io/projected/de744f23-2894-457e-b1a4-b5adbc2c047c-kube-api-access-sqvm4\") pod \"de744f23-2894-457e-b1a4-b5adbc2c047c\" (UID: \"de744f23-2894-457e-b1a4-b5adbc2c047c\") " Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 
09:46:32.520096 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-utilities" (OuterVolumeSpecName: "utilities") pod "de744f23-2894-457e-b1a4-b5adbc2c047c" (UID: "de744f23-2894-457e-b1a4-b5adbc2c047c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.526101 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de744f23-2894-457e-b1a4-b5adbc2c047c-kube-api-access-sqvm4" (OuterVolumeSpecName: "kube-api-access-sqvm4") pod "de744f23-2894-457e-b1a4-b5adbc2c047c" (UID: "de744f23-2894-457e-b1a4-b5adbc2c047c"). InnerVolumeSpecName "kube-api-access-sqvm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.576883 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de744f23-2894-457e-b1a4-b5adbc2c047c" (UID: "de744f23-2894-457e-b1a4-b5adbc2c047c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.621749 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.621787 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqvm4\" (UniqueName: \"kubernetes.io/projected/de744f23-2894-457e-b1a4-b5adbc2c047c-kube-api-access-sqvm4\") on node \"crc\" DevicePath \"\"" Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.621800 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de744f23-2894-457e-b1a4-b5adbc2c047c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.947882 4790 generic.go:334] "Generic (PLEG): container finished" podID="de744f23-2894-457e-b1a4-b5adbc2c047c" containerID="224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e" exitCode=0 Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.948149 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h7bch" Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.948168 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bch" event={"ID":"de744f23-2894-457e-b1a4-b5adbc2c047c","Type":"ContainerDied","Data":"224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e"} Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.948340 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bch" event={"ID":"de744f23-2894-457e-b1a4-b5adbc2c047c","Type":"ContainerDied","Data":"b6acbde307800e087c445659b0a0018db6990b7bf781c7a76b7894a4af2eae30"} Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.948368 4790 scope.go:117] "RemoveContainer" containerID="224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e" Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.973775 4790 scope.go:117] "RemoveContainer" containerID="f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be" Oct 03 09:46:32 crc kubenswrapper[4790]: I1003 09:46:32.990402 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h7bch"] Oct 03 09:46:33 crc kubenswrapper[4790]: I1003 09:46:33.000178 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h7bch"] Oct 03 09:46:33 crc kubenswrapper[4790]: I1003 09:46:33.012364 4790 scope.go:117] "RemoveContainer" containerID="521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602" Oct 03 09:46:33 crc kubenswrapper[4790]: I1003 09:46:33.053116 4790 scope.go:117] "RemoveContainer" containerID="224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e" Oct 03 09:46:33 crc kubenswrapper[4790]: E1003 09:46:33.053636 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e\": container with ID starting with 224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e not found: ID does not exist" containerID="224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e" Oct 03 09:46:33 crc kubenswrapper[4790]: I1003 09:46:33.053685 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e"} err="failed to get container status \"224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e\": rpc error: code = NotFound desc = could not find container \"224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e\": container with ID starting with 224257618cf65a75be5dd8ca47f69e1109cb4b257a06ad827cc5f5d37a9dd20e not found: ID does not exist" Oct 03 09:46:33 crc kubenswrapper[4790]: I1003 09:46:33.053713 4790 scope.go:117] "RemoveContainer" containerID="f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be" Oct 03 09:46:33 crc kubenswrapper[4790]: E1003 09:46:33.054034 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be\": container with ID starting with f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be not found: ID does not exist" containerID="f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be" Oct 03 09:46:33 crc kubenswrapper[4790]: I1003 09:46:33.054127 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be"} err="failed to get container status \"f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be\": rpc error: code = NotFound desc = could not find container \"f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be\": container with ID 
starting with f80038be6313031a2e19c66fe71ad5017d163270d38afa3a47c332cc367961be not found: ID does not exist" Oct 03 09:46:33 crc kubenswrapper[4790]: I1003 09:46:33.054209 4790 scope.go:117] "RemoveContainer" containerID="521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602" Oct 03 09:46:33 crc kubenswrapper[4790]: E1003 09:46:33.054662 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602\": container with ID starting with 521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602 not found: ID does not exist" containerID="521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602" Oct 03 09:46:33 crc kubenswrapper[4790]: I1003 09:46:33.054735 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602"} err="failed to get container status \"521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602\": rpc error: code = NotFound desc = could not find container \"521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602\": container with ID starting with 521d115c53d58201f975509f427c7bd7e1c6186a55f49f689728ad22c9fe4602 not found: ID does not exist" Oct 03 09:46:34 crc kubenswrapper[4790]: I1003 09:46:34.341988 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de744f23-2894-457e-b1a4-b5adbc2c047c" path="/var/lib/kubelet/pods/de744f23-2894-457e-b1a4-b5adbc2c047c/volumes" Oct 03 09:47:40 crc kubenswrapper[4790]: I1003 09:47:40.185943 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:47:40 crc kubenswrapper[4790]: I1003 
09:47:40.186566 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:48:10 crc kubenswrapper[4790]: I1003 09:48:10.186255 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:48:10 crc kubenswrapper[4790]: I1003 09:48:10.186738 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:48:40 crc kubenswrapper[4790]: I1003 09:48:40.186296 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:48:40 crc kubenswrapper[4790]: I1003 09:48:40.186878 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:48:40 crc kubenswrapper[4790]: I1003 09:48:40.186934 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 09:48:40 crc kubenswrapper[4790]: I1003 09:48:40.187544 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8e63f555b14da1aeb2f2ca4b01950ae4847d21aba56ef772f222042cdc7ddc1"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:48:40 crc kubenswrapper[4790]: I1003 09:48:40.187717 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://d8e63f555b14da1aeb2f2ca4b01950ae4847d21aba56ef772f222042cdc7ddc1" gracePeriod=600 Oct 03 09:48:41 crc kubenswrapper[4790]: I1003 09:48:41.232873 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="d8e63f555b14da1aeb2f2ca4b01950ae4847d21aba56ef772f222042cdc7ddc1" exitCode=0 Oct 03 09:48:41 crc kubenswrapper[4790]: I1003 09:48:41.232942 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"d8e63f555b14da1aeb2f2ca4b01950ae4847d21aba56ef772f222042cdc7ddc1"} Oct 03 09:48:41 crc kubenswrapper[4790]: I1003 09:48:41.233490 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380"} Oct 03 09:48:41 crc kubenswrapper[4790]: I1003 09:48:41.233519 4790 scope.go:117] "RemoveContainer" 
containerID="d843dc69f3124e5a638bab870b578cf0acf9e76306d860efb1b16bc10073326a" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.225681 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fxpsz"] Oct 03 09:48:47 crc kubenswrapper[4790]: E1003 09:48:47.226754 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de744f23-2894-457e-b1a4-b5adbc2c047c" containerName="extract-content" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.226772 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de744f23-2894-457e-b1a4-b5adbc2c047c" containerName="extract-content" Oct 03 09:48:47 crc kubenswrapper[4790]: E1003 09:48:47.226813 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de744f23-2894-457e-b1a4-b5adbc2c047c" containerName="extract-utilities" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.226821 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de744f23-2894-457e-b1a4-b5adbc2c047c" containerName="extract-utilities" Oct 03 09:48:47 crc kubenswrapper[4790]: E1003 09:48:47.226837 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de744f23-2894-457e-b1a4-b5adbc2c047c" containerName="registry-server" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.226845 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de744f23-2894-457e-b1a4-b5adbc2c047c" containerName="registry-server" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.227161 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="de744f23-2894-457e-b1a4-b5adbc2c047c" containerName="registry-server" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.228813 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.235154 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fxpsz"] Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.238767 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-utilities\") pod \"community-operators-fxpsz\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.238890 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtgm\" (UniqueName: \"kubernetes.io/projected/ce4ec224-1d55-49ca-902c-96b152141360-kube-api-access-4wtgm\") pod \"community-operators-fxpsz\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.238964 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-catalog-content\") pod \"community-operators-fxpsz\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.340711 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-utilities\") pod \"community-operators-fxpsz\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.340816 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4wtgm\" (UniqueName: \"kubernetes.io/projected/ce4ec224-1d55-49ca-902c-96b152141360-kube-api-access-4wtgm\") pod \"community-operators-fxpsz\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.340879 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-catalog-content\") pod \"community-operators-fxpsz\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.341164 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-utilities\") pod \"community-operators-fxpsz\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.341303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-catalog-content\") pod \"community-operators-fxpsz\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.370923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtgm\" (UniqueName: \"kubernetes.io/projected/ce4ec224-1d55-49ca-902c-96b152141360-kube-api-access-4wtgm\") pod \"community-operators-fxpsz\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:47 crc kubenswrapper[4790]: I1003 09:48:47.554247 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:48 crc kubenswrapper[4790]: I1003 09:48:48.081919 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fxpsz"] Oct 03 09:48:48 crc kubenswrapper[4790]: I1003 09:48:48.304497 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxpsz" event={"ID":"ce4ec224-1d55-49ca-902c-96b152141360","Type":"ContainerStarted","Data":"754aa12f4dceb82b41e2871630352bfdd235dcdf4b3bf68436ea4229f75d27bd"} Oct 03 09:48:49 crc kubenswrapper[4790]: I1003 09:48:49.315946 4790 generic.go:334] "Generic (PLEG): container finished" podID="ce4ec224-1d55-49ca-902c-96b152141360" containerID="71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b" exitCode=0 Oct 03 09:48:49 crc kubenswrapper[4790]: I1003 09:48:49.316048 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxpsz" event={"ID":"ce4ec224-1d55-49ca-902c-96b152141360","Type":"ContainerDied","Data":"71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b"} Oct 03 09:48:49 crc kubenswrapper[4790]: I1003 09:48:49.318192 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:48:50 crc kubenswrapper[4790]: I1003 09:48:50.326208 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxpsz" event={"ID":"ce4ec224-1d55-49ca-902c-96b152141360","Type":"ContainerStarted","Data":"5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2"} Oct 03 09:48:52 crc kubenswrapper[4790]: I1003 09:48:52.344137 4790 generic.go:334] "Generic (PLEG): container finished" podID="ce4ec224-1d55-49ca-902c-96b152141360" containerID="5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2" exitCode=0 Oct 03 09:48:52 crc kubenswrapper[4790]: I1003 09:48:52.344244 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-fxpsz" event={"ID":"ce4ec224-1d55-49ca-902c-96b152141360","Type":"ContainerDied","Data":"5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2"} Oct 03 09:48:53 crc kubenswrapper[4790]: I1003 09:48:53.361883 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxpsz" event={"ID":"ce4ec224-1d55-49ca-902c-96b152141360","Type":"ContainerStarted","Data":"546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a"} Oct 03 09:48:53 crc kubenswrapper[4790]: I1003 09:48:53.387685 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fxpsz" podStartSLOduration=2.887082568 podStartE2EDuration="6.387667995s" podCreationTimestamp="2025-10-03 09:48:47 +0000 UTC" firstStartedPulling="2025-10-03 09:48:49.317861695 +0000 UTC m=+4115.670795933" lastFinishedPulling="2025-10-03 09:48:52.818447122 +0000 UTC m=+4119.171381360" observedRunningTime="2025-10-03 09:48:53.378550102 +0000 UTC m=+4119.731484350" watchObservedRunningTime="2025-10-03 09:48:53.387667995 +0000 UTC m=+4119.740602233" Oct 03 09:48:57 crc kubenswrapper[4790]: I1003 09:48:57.555173 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:57 crc kubenswrapper[4790]: I1003 09:48:57.555744 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:57 crc kubenswrapper[4790]: I1003 09:48:57.613226 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:58 crc kubenswrapper[4790]: I1003 09:48:58.459623 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:48:58 crc kubenswrapper[4790]: I1003 
09:48:58.528760 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fxpsz"] Oct 03 09:49:00 crc kubenswrapper[4790]: I1003 09:49:00.423133 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fxpsz" podUID="ce4ec224-1d55-49ca-902c-96b152141360" containerName="registry-server" containerID="cri-o://546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a" gracePeriod=2 Oct 03 09:49:00 crc kubenswrapper[4790]: I1003 09:49:00.936104 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.044405 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-utilities\") pod \"ce4ec224-1d55-49ca-902c-96b152141360\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.044788 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wtgm\" (UniqueName: \"kubernetes.io/projected/ce4ec224-1d55-49ca-902c-96b152141360-kube-api-access-4wtgm\") pod \"ce4ec224-1d55-49ca-902c-96b152141360\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.044993 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-catalog-content\") pod \"ce4ec224-1d55-49ca-902c-96b152141360\" (UID: \"ce4ec224-1d55-49ca-902c-96b152141360\") " Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.048119 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-utilities" (OuterVolumeSpecName: 
"utilities") pod "ce4ec224-1d55-49ca-902c-96b152141360" (UID: "ce4ec224-1d55-49ca-902c-96b152141360"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.051019 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4ec224-1d55-49ca-902c-96b152141360-kube-api-access-4wtgm" (OuterVolumeSpecName: "kube-api-access-4wtgm") pod "ce4ec224-1d55-49ca-902c-96b152141360" (UID: "ce4ec224-1d55-49ca-902c-96b152141360"). InnerVolumeSpecName "kube-api-access-4wtgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.101852 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce4ec224-1d55-49ca-902c-96b152141360" (UID: "ce4ec224-1d55-49ca-902c-96b152141360"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.147865 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.147914 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wtgm\" (UniqueName: \"kubernetes.io/projected/ce4ec224-1d55-49ca-902c-96b152141360-kube-api-access-4wtgm\") on node \"crc\" DevicePath \"\"" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.147928 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4ec224-1d55-49ca-902c-96b152141360-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.432694 4790 generic.go:334] "Generic (PLEG): container finished" podID="ce4ec224-1d55-49ca-902c-96b152141360" containerID="546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a" exitCode=0 Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.432735 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxpsz" event={"ID":"ce4ec224-1d55-49ca-902c-96b152141360","Type":"ContainerDied","Data":"546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a"} Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.432756 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fxpsz" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.432773 4790 scope.go:117] "RemoveContainer" containerID="546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.432762 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fxpsz" event={"ID":"ce4ec224-1d55-49ca-902c-96b152141360","Type":"ContainerDied","Data":"754aa12f4dceb82b41e2871630352bfdd235dcdf4b3bf68436ea4229f75d27bd"} Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.455431 4790 scope.go:117] "RemoveContainer" containerID="5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.473252 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fxpsz"] Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.482069 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fxpsz"] Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.482394 4790 scope.go:117] "RemoveContainer" containerID="71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.527192 4790 scope.go:117] "RemoveContainer" containerID="546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a" Oct 03 09:49:01 crc kubenswrapper[4790]: E1003 09:49:01.532004 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a\": container with ID starting with 546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a not found: ID does not exist" containerID="546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.532066 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a"} err="failed to get container status \"546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a\": rpc error: code = NotFound desc = could not find container \"546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a\": container with ID starting with 546e5d33ef3660960bf28f3fb9fa05ae8fa78ba41ffbad35bfdfeb4dc81d351a not found: ID does not exist" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.532093 4790 scope.go:117] "RemoveContainer" containerID="5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2" Oct 03 09:49:01 crc kubenswrapper[4790]: E1003 09:49:01.532905 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2\": container with ID starting with 5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2 not found: ID does not exist" containerID="5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.532972 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2"} err="failed to get container status \"5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2\": rpc error: code = NotFound desc = could not find container \"5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2\": container with ID starting with 5c744c384d08b5487b25522a83530c652a07bf8ca016035e848762ba78e691e2 not found: ID does not exist" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.533113 4790 scope.go:117] "RemoveContainer" containerID="71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b" Oct 03 09:49:01 crc kubenswrapper[4790]: E1003 
09:49:01.533814 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b\": container with ID starting with 71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b not found: ID does not exist" containerID="71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b" Oct 03 09:49:01 crc kubenswrapper[4790]: I1003 09:49:01.533854 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b"} err="failed to get container status \"71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b\": rpc error: code = NotFound desc = could not find container \"71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b\": container with ID starting with 71fcafa2426ebb83bb90773b9b85788c14ae2835f7f4094dbd0748ae7ba8501b not found: ID does not exist" Oct 03 09:49:02 crc kubenswrapper[4790]: I1003 09:49:02.372163 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4ec224-1d55-49ca-902c-96b152141360" path="/var/lib/kubelet/pods/ce4ec224-1d55-49ca-902c-96b152141360/volumes" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.458975 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wvrkq"] Oct 03 09:49:15 crc kubenswrapper[4790]: E1003 09:49:15.464017 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4ec224-1d55-49ca-902c-96b152141360" containerName="registry-server" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.464039 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4ec224-1d55-49ca-902c-96b152141360" containerName="registry-server" Oct 03 09:49:15 crc kubenswrapper[4790]: E1003 09:49:15.464066 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4ec224-1d55-49ca-902c-96b152141360" 
containerName="extract-content" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.464072 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4ec224-1d55-49ca-902c-96b152141360" containerName="extract-content" Oct 03 09:49:15 crc kubenswrapper[4790]: E1003 09:49:15.464094 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4ec224-1d55-49ca-902c-96b152141360" containerName="extract-utilities" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.464100 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4ec224-1d55-49ca-902c-96b152141360" containerName="extract-utilities" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.464329 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4ec224-1d55-49ca-902c-96b152141360" containerName="registry-server" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.467045 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.480283 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvrkq"] Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.567937 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-catalog-content\") pod \"redhat-operators-wvrkq\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.568075 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-utilities\") pod \"redhat-operators-wvrkq\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " pod="openshift-marketplace/redhat-operators-wvrkq" Oct 
03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.568401 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl477\" (UniqueName: \"kubernetes.io/projected/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-kube-api-access-bl477\") pod \"redhat-operators-wvrkq\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.672538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-utilities\") pod \"redhat-operators-wvrkq\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.672765 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl477\" (UniqueName: \"kubernetes.io/projected/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-kube-api-access-bl477\") pod \"redhat-operators-wvrkq\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.672801 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-catalog-content\") pod \"redhat-operators-wvrkq\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.673038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-utilities\") pod \"redhat-operators-wvrkq\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:15 crc 
kubenswrapper[4790]: I1003 09:49:15.673227 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-catalog-content\") pod \"redhat-operators-wvrkq\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.692952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl477\" (UniqueName: \"kubernetes.io/projected/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-kube-api-access-bl477\") pod \"redhat-operators-wvrkq\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:15 crc kubenswrapper[4790]: I1003 09:49:15.787210 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:16 crc kubenswrapper[4790]: I1003 09:49:16.272795 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wvrkq"] Oct 03 09:49:16 crc kubenswrapper[4790]: I1003 09:49:16.583616 4790 generic.go:334] "Generic (PLEG): container finished" podID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerID="4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389" exitCode=0 Oct 03 09:49:16 crc kubenswrapper[4790]: I1003 09:49:16.583691 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvrkq" event={"ID":"0b74f8b8-2bf7-43ee-9a9a-771782138b4a","Type":"ContainerDied","Data":"4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389"} Oct 03 09:49:16 crc kubenswrapper[4790]: I1003 09:49:16.583901 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvrkq" 
event={"ID":"0b74f8b8-2bf7-43ee-9a9a-771782138b4a","Type":"ContainerStarted","Data":"609c288d9be25d58ca91b2af6f2ad46548f8376dfa15785eaf8b3072d2ea3ae7"} Oct 03 09:49:18 crc kubenswrapper[4790]: I1003 09:49:18.604648 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvrkq" event={"ID":"0b74f8b8-2bf7-43ee-9a9a-771782138b4a","Type":"ContainerStarted","Data":"f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a"} Oct 03 09:49:21 crc kubenswrapper[4790]: I1003 09:49:21.635746 4790 generic.go:334] "Generic (PLEG): container finished" podID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerID="f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a" exitCode=0 Oct 03 09:49:21 crc kubenswrapper[4790]: I1003 09:49:21.635889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvrkq" event={"ID":"0b74f8b8-2bf7-43ee-9a9a-771782138b4a","Type":"ContainerDied","Data":"f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a"} Oct 03 09:49:22 crc kubenswrapper[4790]: I1003 09:49:22.648802 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvrkq" event={"ID":"0b74f8b8-2bf7-43ee-9a9a-771782138b4a","Type":"ContainerStarted","Data":"5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be"} Oct 03 09:49:22 crc kubenswrapper[4790]: I1003 09:49:22.667586 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wvrkq" podStartSLOduration=2.027768993 podStartE2EDuration="7.667556649s" podCreationTimestamp="2025-10-03 09:49:15 +0000 UTC" firstStartedPulling="2025-10-03 09:49:16.585455014 +0000 UTC m=+4142.938389252" lastFinishedPulling="2025-10-03 09:49:22.22524265 +0000 UTC m=+4148.578176908" observedRunningTime="2025-10-03 09:49:22.666056848 +0000 UTC m=+4149.018991086" watchObservedRunningTime="2025-10-03 09:49:22.667556649 +0000 UTC m=+4149.020490887" 
Oct 03 09:49:25 crc kubenswrapper[4790]: I1003 09:49:25.788279 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:25 crc kubenswrapper[4790]: I1003 09:49:25.789793 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:26 crc kubenswrapper[4790]: I1003 09:49:26.835259 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wvrkq" podUID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerName="registry-server" probeResult="failure" output=< Oct 03 09:49:26 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 09:49:26 crc kubenswrapper[4790]: > Oct 03 09:49:35 crc kubenswrapper[4790]: I1003 09:49:35.843718 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:35 crc kubenswrapper[4790]: I1003 09:49:35.890811 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:36 crc kubenswrapper[4790]: I1003 09:49:36.081140 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvrkq"] Oct 03 09:49:37 crc kubenswrapper[4790]: I1003 09:49:37.795491 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wvrkq" podUID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerName="registry-server" containerID="cri-o://5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be" gracePeriod=2 Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.258651 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.460314 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl477\" (UniqueName: \"kubernetes.io/projected/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-kube-api-access-bl477\") pod \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.460676 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-catalog-content\") pod \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.460746 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-utilities\") pod \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\" (UID: \"0b74f8b8-2bf7-43ee-9a9a-771782138b4a\") " Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.461887 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-utilities" (OuterVolumeSpecName: "utilities") pod "0b74f8b8-2bf7-43ee-9a9a-771782138b4a" (UID: "0b74f8b8-2bf7-43ee-9a9a-771782138b4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.481202 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-kube-api-access-bl477" (OuterVolumeSpecName: "kube-api-access-bl477") pod "0b74f8b8-2bf7-43ee-9a9a-771782138b4a" (UID: "0b74f8b8-2bf7-43ee-9a9a-771782138b4a"). InnerVolumeSpecName "kube-api-access-bl477". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.549468 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b74f8b8-2bf7-43ee-9a9a-771782138b4a" (UID: "0b74f8b8-2bf7-43ee-9a9a-771782138b4a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.565505 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.565546 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.565558 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl477\" (UniqueName: \"kubernetes.io/projected/0b74f8b8-2bf7-43ee-9a9a-771782138b4a-kube-api-access-bl477\") on node \"crc\" DevicePath \"\"" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.807633 4790 generic.go:334] "Generic (PLEG): container finished" podID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerID="5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be" exitCode=0 Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.807690 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wvrkq" event={"ID":"0b74f8b8-2bf7-43ee-9a9a-771782138b4a","Type":"ContainerDied","Data":"5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be"} Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.807729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-wvrkq" event={"ID":"0b74f8b8-2bf7-43ee-9a9a-771782138b4a","Type":"ContainerDied","Data":"609c288d9be25d58ca91b2af6f2ad46548f8376dfa15785eaf8b3072d2ea3ae7"} Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.807751 4790 scope.go:117] "RemoveContainer" containerID="5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.807795 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wvrkq" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.839825 4790 scope.go:117] "RemoveContainer" containerID="f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.846345 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wvrkq"] Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.855456 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wvrkq"] Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.871278 4790 scope.go:117] "RemoveContainer" containerID="4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.919068 4790 scope.go:117] "RemoveContainer" containerID="5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be" Oct 03 09:49:38 crc kubenswrapper[4790]: E1003 09:49:38.919775 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be\": container with ID starting with 5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be not found: ID does not exist" containerID="5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.919849 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be"} err="failed to get container status \"5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be\": rpc error: code = NotFound desc = could not find container \"5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be\": container with ID starting with 5b21473811afb2956660c071ab287809dcc36f964039c533f3fae68a2b9835be not found: ID does not exist" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.919898 4790 scope.go:117] "RemoveContainer" containerID="f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a" Oct 03 09:49:38 crc kubenswrapper[4790]: E1003 09:49:38.920148 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a\": container with ID starting with f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a not found: ID does not exist" containerID="f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.920201 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a"} err="failed to get container status \"f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a\": rpc error: code = NotFound desc = could not find container \"f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a\": container with ID starting with f30a6452845add79c002a464326fb19878aace79c393e393d089aa7b9cfdbf0a not found: ID does not exist" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.920218 4790 scope.go:117] "RemoveContainer" containerID="4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389" Oct 03 09:49:38 crc kubenswrapper[4790]: E1003 
09:49:38.920509 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389\": container with ID starting with 4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389 not found: ID does not exist" containerID="4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389" Oct 03 09:49:38 crc kubenswrapper[4790]: I1003 09:49:38.920563 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389"} err="failed to get container status \"4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389\": rpc error: code = NotFound desc = could not find container \"4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389\": container with ID starting with 4613e7a1d43c8167aa26cfa1dd5bf3fe6027c88fe9ab01ff13455e1d07520389 not found: ID does not exist" Oct 03 09:49:40 crc kubenswrapper[4790]: I1003 09:49:40.344607 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" path="/var/lib/kubelet/pods/0b74f8b8-2bf7-43ee-9a9a-771782138b4a/volumes" Oct 03 09:50:40 crc kubenswrapper[4790]: I1003 09:50:40.185908 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:50:40 crc kubenswrapper[4790]: I1003 09:50:40.186520 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 03 09:51:10 crc kubenswrapper[4790]: I1003 09:51:10.186681 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:51:10 crc kubenswrapper[4790]: I1003 09:51:10.188454 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:51:40 crc kubenswrapper[4790]: I1003 09:51:40.186396 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:51:40 crc kubenswrapper[4790]: I1003 09:51:40.186954 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:51:40 crc kubenswrapper[4790]: I1003 09:51:40.186999 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 09:51:40 crc kubenswrapper[4790]: I1003 09:51:40.187850 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380"} 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 09:51:40 crc kubenswrapper[4790]: I1003 09:51:40.187911 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" gracePeriod=600 Oct 03 09:51:40 crc kubenswrapper[4790]: E1003 09:51:40.315489 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:51:40 crc kubenswrapper[4790]: I1003 09:51:40.947460 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" exitCode=0 Oct 03 09:51:40 crc kubenswrapper[4790]: I1003 09:51:40.947518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380"} Oct 03 09:51:40 crc kubenswrapper[4790]: I1003 09:51:40.947568 4790 scope.go:117] "RemoveContainer" containerID="d8e63f555b14da1aeb2f2ca4b01950ae4847d21aba56ef772f222042cdc7ddc1" Oct 03 09:51:40 crc kubenswrapper[4790]: I1003 09:51:40.949295 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 
03 09:51:40 crc kubenswrapper[4790]: E1003 09:51:40.949882 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:51:52 crc kubenswrapper[4790]: I1003 09:51:52.327992 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:51:52 crc kubenswrapper[4790]: E1003 09:51:52.328844 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:52:06 crc kubenswrapper[4790]: I1003 09:52:06.329077 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:52:06 crc kubenswrapper[4790]: E1003 09:52:06.329874 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:52:19 crc kubenswrapper[4790]: I1003 09:52:19.329030 4790 scope.go:117] "RemoveContainer" 
containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:52:19 crc kubenswrapper[4790]: E1003 09:52:19.329871 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:52:33 crc kubenswrapper[4790]: I1003 09:52:33.328450 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:52:33 crc kubenswrapper[4790]: E1003 09:52:33.329300 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:52:46 crc kubenswrapper[4790]: I1003 09:52:46.328872 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:52:46 crc kubenswrapper[4790]: E1003 09:52:46.329627 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:53:00 crc kubenswrapper[4790]: I1003 09:53:00.328079 4790 scope.go:117] 
"RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:53:00 crc kubenswrapper[4790]: E1003 09:53:00.328879 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:53:11 crc kubenswrapper[4790]: I1003 09:53:11.328622 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:53:11 crc kubenswrapper[4790]: E1003 09:53:11.329383 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:53:26 crc kubenswrapper[4790]: I1003 09:53:26.328735 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:53:26 crc kubenswrapper[4790]: E1003 09:53:26.329868 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:53:39 crc kubenswrapper[4790]: I1003 09:53:39.329476 
4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:53:39 crc kubenswrapper[4790]: E1003 09:53:39.330393 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:53:50 crc kubenswrapper[4790]: I1003 09:53:50.328249 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:53:50 crc kubenswrapper[4790]: E1003 09:53:50.329074 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.139392 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pxffj"] Oct 03 09:53:57 crc kubenswrapper[4790]: E1003 09:53:57.140215 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerName="registry-server" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.140226 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerName="registry-server" Oct 03 09:53:57 crc kubenswrapper[4790]: E1003 09:53:57.140266 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerName="extract-utilities" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.140273 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerName="extract-utilities" Oct 03 09:53:57 crc kubenswrapper[4790]: E1003 09:53:57.140279 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerName="extract-content" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.140286 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerName="extract-content" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.140474 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b74f8b8-2bf7-43ee-9a9a-771782138b4a" containerName="registry-server" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.142244 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.149933 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxffj"] Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.291030 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-catalog-content\") pod \"redhat-marketplace-pxffj\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.291460 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-utilities\") pod \"redhat-marketplace-pxffj\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") 
" pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.291612 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkh5l\" (UniqueName: \"kubernetes.io/projected/9f2bcb97-68f1-4428-8716-98dc41980be5-kube-api-access-xkh5l\") pod \"redhat-marketplace-pxffj\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.393258 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-catalog-content\") pod \"redhat-marketplace-pxffj\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.393566 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-utilities\") pod \"redhat-marketplace-pxffj\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.393680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkh5l\" (UniqueName: \"kubernetes.io/projected/9f2bcb97-68f1-4428-8716-98dc41980be5-kube-api-access-xkh5l\") pod \"redhat-marketplace-pxffj\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.394376 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-utilities\") pod \"redhat-marketplace-pxffj\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " 
pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.395095 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-catalog-content\") pod \"redhat-marketplace-pxffj\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.417276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkh5l\" (UniqueName: \"kubernetes.io/projected/9f2bcb97-68f1-4428-8716-98dc41980be5-kube-api-access-xkh5l\") pod \"redhat-marketplace-pxffj\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:57 crc kubenswrapper[4790]: I1003 09:53:57.496370 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:53:58 crc kubenswrapper[4790]: I1003 09:53:58.010058 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxffj"] Oct 03 09:53:58 crc kubenswrapper[4790]: I1003 09:53:58.268327 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f2bcb97-68f1-4428-8716-98dc41980be5" containerID="4fb5900507f809b2410d418636e5371467a766a9735d86e39a851dea85b52620" exitCode=0 Oct 03 09:53:58 crc kubenswrapper[4790]: I1003 09:53:58.268414 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxffj" event={"ID":"9f2bcb97-68f1-4428-8716-98dc41980be5","Type":"ContainerDied","Data":"4fb5900507f809b2410d418636e5371467a766a9735d86e39a851dea85b52620"} Oct 03 09:53:58 crc kubenswrapper[4790]: I1003 09:53:58.268774 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxffj" 
event={"ID":"9f2bcb97-68f1-4428-8716-98dc41980be5","Type":"ContainerStarted","Data":"797f6b9f69c916bf3acbaa804b6debe59128d895921881bc9f331799180a74d1"} Oct 03 09:53:58 crc kubenswrapper[4790]: I1003 09:53:58.270710 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:53:59 crc kubenswrapper[4790]: I1003 09:53:59.282439 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxffj" event={"ID":"9f2bcb97-68f1-4428-8716-98dc41980be5","Type":"ContainerStarted","Data":"8a3121ca6aa9e11b5dad943f67b712e0989fae9d1fa2c001d1548de485125e11"} Oct 03 09:54:00 crc kubenswrapper[4790]: I1003 09:54:00.293094 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f2bcb97-68f1-4428-8716-98dc41980be5" containerID="8a3121ca6aa9e11b5dad943f67b712e0989fae9d1fa2c001d1548de485125e11" exitCode=0 Oct 03 09:54:00 crc kubenswrapper[4790]: I1003 09:54:00.293209 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxffj" event={"ID":"9f2bcb97-68f1-4428-8716-98dc41980be5","Type":"ContainerDied","Data":"8a3121ca6aa9e11b5dad943f67b712e0989fae9d1fa2c001d1548de485125e11"} Oct 03 09:54:01 crc kubenswrapper[4790]: I1003 09:54:01.306344 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxffj" event={"ID":"9f2bcb97-68f1-4428-8716-98dc41980be5","Type":"ContainerStarted","Data":"a192ba19f859491de41bee269759939cacab6c07141a220e4fce99e939383897"} Oct 03 09:54:01 crc kubenswrapper[4790]: I1003 09:54:01.329929 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pxffj" podStartSLOduration=1.6371342009999998 podStartE2EDuration="4.32990882s" podCreationTimestamp="2025-10-03 09:53:57 +0000 UTC" firstStartedPulling="2025-10-03 09:53:58.270347637 +0000 UTC m=+4424.623281875" lastFinishedPulling="2025-10-03 09:54:00.963122256 +0000 UTC 
m=+4427.316056494" observedRunningTime="2025-10-03 09:54:01.320676775 +0000 UTC m=+4427.673611023" watchObservedRunningTime="2025-10-03 09:54:01.32990882 +0000 UTC m=+4427.682843078" Oct 03 09:54:05 crc kubenswrapper[4790]: I1003 09:54:05.328642 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:54:05 crc kubenswrapper[4790]: E1003 09:54:05.329376 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:54:07 crc kubenswrapper[4790]: I1003 09:54:07.497365 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:54:07 crc kubenswrapper[4790]: I1003 09:54:07.497824 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:54:07 crc kubenswrapper[4790]: I1003 09:54:07.549254 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:54:08 crc kubenswrapper[4790]: I1003 09:54:08.414244 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:54:08 crc kubenswrapper[4790]: I1003 09:54:08.467669 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxffj"] Oct 03 09:54:10 crc kubenswrapper[4790]: I1003 09:54:10.385853 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pxffj" 
podUID="9f2bcb97-68f1-4428-8716-98dc41980be5" containerName="registry-server" containerID="cri-o://a192ba19f859491de41bee269759939cacab6c07141a220e4fce99e939383897" gracePeriod=2 Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.395715 4790 generic.go:334] "Generic (PLEG): container finished" podID="9f2bcb97-68f1-4428-8716-98dc41980be5" containerID="a192ba19f859491de41bee269759939cacab6c07141a220e4fce99e939383897" exitCode=0 Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.395777 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxffj" event={"ID":"9f2bcb97-68f1-4428-8716-98dc41980be5","Type":"ContainerDied","Data":"a192ba19f859491de41bee269759939cacab6c07141a220e4fce99e939383897"} Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.396288 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxffj" event={"ID":"9f2bcb97-68f1-4428-8716-98dc41980be5","Type":"ContainerDied","Data":"797f6b9f69c916bf3acbaa804b6debe59128d895921881bc9f331799180a74d1"} Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.396300 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="797f6b9f69c916bf3acbaa804b6debe59128d895921881bc9f331799180a74d1" Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.669719 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.725835 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-utilities\") pod \"9f2bcb97-68f1-4428-8716-98dc41980be5\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.726094 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-catalog-content\") pod \"9f2bcb97-68f1-4428-8716-98dc41980be5\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.726221 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkh5l\" (UniqueName: \"kubernetes.io/projected/9f2bcb97-68f1-4428-8716-98dc41980be5-kube-api-access-xkh5l\") pod \"9f2bcb97-68f1-4428-8716-98dc41980be5\" (UID: \"9f2bcb97-68f1-4428-8716-98dc41980be5\") " Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.732672 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f2bcb97-68f1-4428-8716-98dc41980be5-kube-api-access-xkh5l" (OuterVolumeSpecName: "kube-api-access-xkh5l") pod "9f2bcb97-68f1-4428-8716-98dc41980be5" (UID: "9f2bcb97-68f1-4428-8716-98dc41980be5"). InnerVolumeSpecName "kube-api-access-xkh5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.740166 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-utilities" (OuterVolumeSpecName: "utilities") pod "9f2bcb97-68f1-4428-8716-98dc41980be5" (UID: "9f2bcb97-68f1-4428-8716-98dc41980be5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.747921 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f2bcb97-68f1-4428-8716-98dc41980be5" (UID: "9f2bcb97-68f1-4428-8716-98dc41980be5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.829562 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.829625 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f2bcb97-68f1-4428-8716-98dc41980be5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:11 crc kubenswrapper[4790]: I1003 09:54:11.829640 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkh5l\" (UniqueName: \"kubernetes.io/projected/9f2bcb97-68f1-4428-8716-98dc41980be5-kube-api-access-xkh5l\") on node \"crc\" DevicePath \"\"" Oct 03 09:54:12 crc kubenswrapper[4790]: I1003 09:54:12.404935 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxffj" Oct 03 09:54:12 crc kubenswrapper[4790]: I1003 09:54:12.428719 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxffj"] Oct 03 09:54:12 crc kubenswrapper[4790]: I1003 09:54:12.436977 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxffj"] Oct 03 09:54:14 crc kubenswrapper[4790]: I1003 09:54:14.341844 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f2bcb97-68f1-4428-8716-98dc41980be5" path="/var/lib/kubelet/pods/9f2bcb97-68f1-4428-8716-98dc41980be5/volumes" Oct 03 09:54:16 crc kubenswrapper[4790]: I1003 09:54:16.328283 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:54:16 crc kubenswrapper[4790]: E1003 09:54:16.330541 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:54:27 crc kubenswrapper[4790]: I1003 09:54:27.328488 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:54:27 crc kubenswrapper[4790]: E1003 09:54:27.330846 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" 
podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:54:40 crc kubenswrapper[4790]: I1003 09:54:40.329641 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:54:40 crc kubenswrapper[4790]: E1003 09:54:40.330340 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:54:52 crc kubenswrapper[4790]: I1003 09:54:52.327786 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:54:52 crc kubenswrapper[4790]: E1003 09:54:52.328609 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:55:05 crc kubenswrapper[4790]: I1003 09:55:05.328781 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:55:05 crc kubenswrapper[4790]: E1003 09:55:05.329910 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:55:18 crc kubenswrapper[4790]: I1003 09:55:18.328734 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:55:18 crc kubenswrapper[4790]: E1003 09:55:18.329935 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:55:29 crc kubenswrapper[4790]: I1003 09:55:29.328484 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:55:29 crc kubenswrapper[4790]: E1003 09:55:29.329299 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:55:42 crc kubenswrapper[4790]: I1003 09:55:42.328348 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:55:42 crc kubenswrapper[4790]: E1003 09:55:42.329360 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:55:57 crc kubenswrapper[4790]: I1003 09:55:57.328836 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:55:57 crc kubenswrapper[4790]: E1003 09:55:57.329819 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:56:09 crc kubenswrapper[4790]: I1003 09:56:09.327685 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:56:09 crc kubenswrapper[4790]: E1003 09:56:09.329618 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:56:23 crc kubenswrapper[4790]: I1003 09:56:23.328819 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:56:23 crc kubenswrapper[4790]: E1003 09:56:23.329623 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:56:35 crc kubenswrapper[4790]: I1003 09:56:35.329064 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:56:35 crc kubenswrapper[4790]: E1003 09:56:35.330068 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 09:56:47 crc kubenswrapper[4790]: I1003 09:56:47.328833 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 09:56:47 crc kubenswrapper[4790]: I1003 09:56:47.934526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"b4fffdcc571671131c53b354b11ff7cfa23698400512e2f6c8453ede4bebf434"} Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.034184 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2gjvv"] Oct 03 09:57:24 crc kubenswrapper[4790]: E1003 09:57:24.035166 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f2bcb97-68f1-4428-8716-98dc41980be5" containerName="extract-utilities" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.035182 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2bcb97-68f1-4428-8716-98dc41980be5" 
containerName="extract-utilities" Oct 03 09:57:24 crc kubenswrapper[4790]: E1003 09:57:24.035220 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f2bcb97-68f1-4428-8716-98dc41980be5" containerName="extract-content" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.035228 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2bcb97-68f1-4428-8716-98dc41980be5" containerName="extract-content" Oct 03 09:57:24 crc kubenswrapper[4790]: E1003 09:57:24.035245 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f2bcb97-68f1-4428-8716-98dc41980be5" containerName="registry-server" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.035253 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2bcb97-68f1-4428-8716-98dc41980be5" containerName="registry-server" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.035504 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f2bcb97-68f1-4428-8716-98dc41980be5" containerName="registry-server" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.037834 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.059100 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2gjvv"] Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.103367 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsn47\" (UniqueName: \"kubernetes.io/projected/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-kube-api-access-wsn47\") pod \"certified-operators-2gjvv\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.103484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-catalog-content\") pod \"certified-operators-2gjvv\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.103535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-utilities\") pod \"certified-operators-2gjvv\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.205453 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsn47\" (UniqueName: \"kubernetes.io/projected/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-kube-api-access-wsn47\") pod \"certified-operators-2gjvv\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.205610 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-catalog-content\") pod \"certified-operators-2gjvv\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.205656 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-utilities\") pod \"certified-operators-2gjvv\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.206088 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-catalog-content\") pod \"certified-operators-2gjvv\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.206219 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-utilities\") pod \"certified-operators-2gjvv\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.235331 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsn47\" (UniqueName: \"kubernetes.io/projected/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-kube-api-access-wsn47\") pod \"certified-operators-2gjvv\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:24 crc kubenswrapper[4790]: I1003 09:57:24.360054 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:25 crc kubenswrapper[4790]: I1003 09:57:25.023383 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2gjvv"] Oct 03 09:57:25 crc kubenswrapper[4790]: W1003 09:57:25.027749 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea2ecad8_9924_4c5f_bc8b_3a85040d6842.slice/crio-9514d5f11aa723a3da5063ebff42daf5c3ef2ae690147f9c050d0f36b7fd40d7 WatchSource:0}: Error finding container 9514d5f11aa723a3da5063ebff42daf5c3ef2ae690147f9c050d0f36b7fd40d7: Status 404 returned error can't find the container with id 9514d5f11aa723a3da5063ebff42daf5c3ef2ae690147f9c050d0f36b7fd40d7 Oct 03 09:57:25 crc kubenswrapper[4790]: I1003 09:57:25.286693 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gjvv" event={"ID":"ea2ecad8-9924-4c5f-bc8b-3a85040d6842","Type":"ContainerStarted","Data":"9514d5f11aa723a3da5063ebff42daf5c3ef2ae690147f9c050d0f36b7fd40d7"} Oct 03 09:57:26 crc kubenswrapper[4790]: I1003 09:57:26.296725 4790 generic.go:334] "Generic (PLEG): container finished" podID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerID="3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee" exitCode=0 Oct 03 09:57:26 crc kubenswrapper[4790]: I1003 09:57:26.296809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gjvv" event={"ID":"ea2ecad8-9924-4c5f-bc8b-3a85040d6842","Type":"ContainerDied","Data":"3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee"} Oct 03 09:57:29 crc kubenswrapper[4790]: I1003 09:57:29.330993 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gjvv" 
event={"ID":"ea2ecad8-9924-4c5f-bc8b-3a85040d6842","Type":"ContainerStarted","Data":"3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82"} Oct 03 09:57:30 crc kubenswrapper[4790]: I1003 09:57:30.342474 4790 generic.go:334] "Generic (PLEG): container finished" podID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerID="3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82" exitCode=0 Oct 03 09:57:30 crc kubenswrapper[4790]: I1003 09:57:30.342537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gjvv" event={"ID":"ea2ecad8-9924-4c5f-bc8b-3a85040d6842","Type":"ContainerDied","Data":"3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82"} Oct 03 09:57:31 crc kubenswrapper[4790]: I1003 09:57:31.353919 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gjvv" event={"ID":"ea2ecad8-9924-4c5f-bc8b-3a85040d6842","Type":"ContainerStarted","Data":"720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056"} Oct 03 09:57:31 crc kubenswrapper[4790]: I1003 09:57:31.370319 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2gjvv" podStartSLOduration=2.7705315390000003 podStartE2EDuration="7.370301847s" podCreationTimestamp="2025-10-03 09:57:24 +0000 UTC" firstStartedPulling="2025-10-03 09:57:26.2993522 +0000 UTC m=+4632.652286438" lastFinishedPulling="2025-10-03 09:57:30.899122508 +0000 UTC m=+4637.252056746" observedRunningTime="2025-10-03 09:57:31.368368914 +0000 UTC m=+4637.721303162" watchObservedRunningTime="2025-10-03 09:57:31.370301847 +0000 UTC m=+4637.723236085" Oct 03 09:57:34 crc kubenswrapper[4790]: I1003 09:57:34.361099 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:34 crc kubenswrapper[4790]: I1003 09:57:34.361757 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:34 crc kubenswrapper[4790]: I1003 09:57:34.413915 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:44 crc kubenswrapper[4790]: I1003 09:57:44.410332 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:44 crc kubenswrapper[4790]: I1003 09:57:44.461532 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2gjvv"] Oct 03 09:57:44 crc kubenswrapper[4790]: I1003 09:57:44.530909 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2gjvv" podUID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerName="registry-server" containerID="cri-o://720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056" gracePeriod=2 Oct 03 09:57:44 crc kubenswrapper[4790]: I1003 09:57:44.999628 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.183068 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-catalog-content\") pod \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.187852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-utilities\") pod \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.188000 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsn47\" (UniqueName: \"kubernetes.io/projected/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-kube-api-access-wsn47\") pod \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\" (UID: \"ea2ecad8-9924-4c5f-bc8b-3a85040d6842\") " Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.189735 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-utilities" (OuterVolumeSpecName: "utilities") pod "ea2ecad8-9924-4c5f-bc8b-3a85040d6842" (UID: "ea2ecad8-9924-4c5f-bc8b-3a85040d6842"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.196945 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-kube-api-access-wsn47" (OuterVolumeSpecName: "kube-api-access-wsn47") pod "ea2ecad8-9924-4c5f-bc8b-3a85040d6842" (UID: "ea2ecad8-9924-4c5f-bc8b-3a85040d6842"). InnerVolumeSpecName "kube-api-access-wsn47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.231159 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea2ecad8-9924-4c5f-bc8b-3a85040d6842" (UID: "ea2ecad8-9924-4c5f-bc8b-3a85040d6842"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.291401 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.291448 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsn47\" (UniqueName: \"kubernetes.io/projected/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-kube-api-access-wsn47\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.291459 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea2ecad8-9924-4c5f-bc8b-3a85040d6842-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.543005 4790 generic.go:334] "Generic (PLEG): container finished" podID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerID="720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056" exitCode=0 Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.543075 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2gjvv" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.543098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gjvv" event={"ID":"ea2ecad8-9924-4c5f-bc8b-3a85040d6842","Type":"ContainerDied","Data":"720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056"} Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.543646 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2gjvv" event={"ID":"ea2ecad8-9924-4c5f-bc8b-3a85040d6842","Type":"ContainerDied","Data":"9514d5f11aa723a3da5063ebff42daf5c3ef2ae690147f9c050d0f36b7fd40d7"} Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.543701 4790 scope.go:117] "RemoveContainer" containerID="720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.566379 4790 scope.go:117] "RemoveContainer" containerID="3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.579415 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2gjvv"] Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.588797 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2gjvv"] Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.617723 4790 scope.go:117] "RemoveContainer" containerID="3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.656639 4790 scope.go:117] "RemoveContainer" containerID="720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056" Oct 03 09:57:45 crc kubenswrapper[4790]: E1003 09:57:45.657053 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056\": container with ID starting with 720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056 not found: ID does not exist" containerID="720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.657081 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056"} err="failed to get container status \"720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056\": rpc error: code = NotFound desc = could not find container \"720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056\": container with ID starting with 720fc8383294a2f4a1114edf0efd243f0573352cb78acaaab49e34d75d972056 not found: ID does not exist" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.657099 4790 scope.go:117] "RemoveContainer" containerID="3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82" Oct 03 09:57:45 crc kubenswrapper[4790]: E1003 09:57:45.657565 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82\": container with ID starting with 3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82 not found: ID does not exist" containerID="3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.657641 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82"} err="failed to get container status \"3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82\": rpc error: code = NotFound desc = could not find container \"3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82\": container with ID 
starting with 3c13f7078703d7ce2c3c23558f4d72fdfe551860f9287d56c6eed19259cedd82 not found: ID does not exist" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.657678 4790 scope.go:117] "RemoveContainer" containerID="3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee" Oct 03 09:57:45 crc kubenswrapper[4790]: E1003 09:57:45.658070 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee\": container with ID starting with 3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee not found: ID does not exist" containerID="3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee" Oct 03 09:57:45 crc kubenswrapper[4790]: I1003 09:57:45.658192 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee"} err="failed to get container status \"3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee\": rpc error: code = NotFound desc = could not find container \"3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee\": container with ID starting with 3ad5ea8abbc72787ae5ebc593d9babc230a81b26b9b0db48d7dc657e0f6e71ee not found: ID does not exist" Oct 03 09:57:46 crc kubenswrapper[4790]: I1003 09:57:46.338727 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" path="/var/lib/kubelet/pods/ea2ecad8-9924-4c5f-bc8b-3a85040d6842/volumes" Oct 03 09:59:10 crc kubenswrapper[4790]: I1003 09:59:10.185705 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:59:10 crc kubenswrapper[4790]: I1003 
09:59:10.186337 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:36.999475 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cjzrr"] Oct 03 09:59:37 crc kubenswrapper[4790]: E1003 09:59:37.000677 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerName="extract-content" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.000697 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerName="extract-content" Oct 03 09:59:37 crc kubenswrapper[4790]: E1003 09:59:37.000733 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerName="extract-utilities" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.000741 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerName="extract-utilities" Oct 03 09:59:37 crc kubenswrapper[4790]: E1003 09:59:37.000771 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerName="registry-server" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.000788 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerName="registry-server" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.002678 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2ecad8-9924-4c5f-bc8b-3a85040d6842" containerName="registry-server" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.029041 4790 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.048301 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cjzrr"] Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.169672 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-utilities\") pod \"redhat-operators-cjzrr\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.169815 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzm8n\" (UniqueName: \"kubernetes.io/projected/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-kube-api-access-dzm8n\") pod \"redhat-operators-cjzrr\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.170093 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-catalog-content\") pod \"redhat-operators-cjzrr\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.272858 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-catalog-content\") pod \"redhat-operators-cjzrr\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.272991 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-utilities\") pod \"redhat-operators-cjzrr\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.273103 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzm8n\" (UniqueName: \"kubernetes.io/projected/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-kube-api-access-dzm8n\") pod \"redhat-operators-cjzrr\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.273380 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-catalog-content\") pod \"redhat-operators-cjzrr\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.273708 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-utilities\") pod \"redhat-operators-cjzrr\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.455041 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzm8n\" (UniqueName: \"kubernetes.io/projected/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-kube-api-access-dzm8n\") pod \"redhat-operators-cjzrr\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:37 crc kubenswrapper[4790]: I1003 09:59:37.671654 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:38 crc kubenswrapper[4790]: I1003 09:59:38.388676 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cjzrr"] Oct 03 09:59:38 crc kubenswrapper[4790]: I1003 09:59:38.645299 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjzrr" event={"ID":"5deedefb-63ba-4e8f-ba6b-bdf060d61f19","Type":"ContainerStarted","Data":"7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8"} Oct 03 09:59:38 crc kubenswrapper[4790]: I1003 09:59:38.645351 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjzrr" event={"ID":"5deedefb-63ba-4e8f-ba6b-bdf060d61f19","Type":"ContainerStarted","Data":"3f0c0ec39b44d76dbc9feb59ab16adb11ef4ba740fb75d5427bbad8da1f9f9a1"} Oct 03 09:59:38 crc kubenswrapper[4790]: I1003 09:59:38.647434 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 09:59:39 crc kubenswrapper[4790]: I1003 09:59:39.662879 4790 generic.go:334] "Generic (PLEG): container finished" podID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerID="7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8" exitCode=0 Oct 03 09:59:39 crc kubenswrapper[4790]: I1003 09:59:39.662941 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjzrr" event={"ID":"5deedefb-63ba-4e8f-ba6b-bdf060d61f19","Type":"ContainerDied","Data":"7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8"} Oct 03 09:59:40 crc kubenswrapper[4790]: I1003 09:59:40.185871 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 09:59:40 crc 
kubenswrapper[4790]: I1003 09:59:40.186694 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 09:59:40 crc kubenswrapper[4790]: I1003 09:59:40.682187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjzrr" event={"ID":"5deedefb-63ba-4e8f-ba6b-bdf060d61f19","Type":"ContainerStarted","Data":"404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6"} Oct 03 09:59:44 crc kubenswrapper[4790]: I1003 09:59:44.728866 4790 generic.go:334] "Generic (PLEG): container finished" podID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerID="404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6" exitCode=0 Oct 03 09:59:44 crc kubenswrapper[4790]: I1003 09:59:44.728929 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjzrr" event={"ID":"5deedefb-63ba-4e8f-ba6b-bdf060d61f19","Type":"ContainerDied","Data":"404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6"} Oct 03 09:59:45 crc kubenswrapper[4790]: I1003 09:59:45.740709 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjzrr" event={"ID":"5deedefb-63ba-4e8f-ba6b-bdf060d61f19","Type":"ContainerStarted","Data":"82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a"} Oct 03 09:59:45 crc kubenswrapper[4790]: I1003 09:59:45.760727 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cjzrr" podStartSLOduration=3.217394715 podStartE2EDuration="9.760703653s" podCreationTimestamp="2025-10-03 09:59:36 +0000 UTC" firstStartedPulling="2025-10-03 09:59:38.647139903 +0000 UTC m=+4765.000074141" 
lastFinishedPulling="2025-10-03 09:59:45.190448831 +0000 UTC m=+4771.543383079" observedRunningTime="2025-10-03 09:59:45.759058638 +0000 UTC m=+4772.111992876" watchObservedRunningTime="2025-10-03 09:59:45.760703653 +0000 UTC m=+4772.113637891" Oct 03 09:59:47 crc kubenswrapper[4790]: I1003 09:59:47.672175 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:47 crc kubenswrapper[4790]: I1003 09:59:47.673733 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:48 crc kubenswrapper[4790]: I1003 09:59:48.995009 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cjzrr" podUID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerName="registry-server" probeResult="failure" output=< Oct 03 09:59:48 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 09:59:48 crc kubenswrapper[4790]: > Oct 03 09:59:57 crc kubenswrapper[4790]: I1003 09:59:57.724982 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:57 crc kubenswrapper[4790]: I1003 09:59:57.772467 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:57 crc kubenswrapper[4790]: I1003 09:59:57.960332 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cjzrr"] Oct 03 09:59:58 crc kubenswrapper[4790]: I1003 09:59:58.873104 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cjzrr" podUID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerName="registry-server" containerID="cri-o://82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a" gracePeriod=2 Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 
09:59:59.409981 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.601903 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzm8n\" (UniqueName: \"kubernetes.io/projected/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-kube-api-access-dzm8n\") pod \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.602022 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-utilities\") pod \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.602294 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-catalog-content\") pod \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\" (UID: \"5deedefb-63ba-4e8f-ba6b-bdf060d61f19\") " Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.602812 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-utilities" (OuterVolumeSpecName: "utilities") pod "5deedefb-63ba-4e8f-ba6b-bdf060d61f19" (UID: "5deedefb-63ba-4e8f-ba6b-bdf060d61f19"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.605402 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.609114 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-kube-api-access-dzm8n" (OuterVolumeSpecName: "kube-api-access-dzm8n") pod "5deedefb-63ba-4e8f-ba6b-bdf060d61f19" (UID: "5deedefb-63ba-4e8f-ba6b-bdf060d61f19"). InnerVolumeSpecName "kube-api-access-dzm8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.698655 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5deedefb-63ba-4e8f-ba6b-bdf060d61f19" (UID: "5deedefb-63ba-4e8f-ba6b-bdf060d61f19"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.707131 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.707172 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzm8n\" (UniqueName: \"kubernetes.io/projected/5deedefb-63ba-4e8f-ba6b-bdf060d61f19-kube-api-access-dzm8n\") on node \"crc\" DevicePath \"\"" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.884769 4790 generic.go:334] "Generic (PLEG): container finished" podID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerID="82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a" exitCode=0 Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.884819 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjzrr" event={"ID":"5deedefb-63ba-4e8f-ba6b-bdf060d61f19","Type":"ContainerDied","Data":"82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a"} Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.884842 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cjzrr" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.884869 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cjzrr" event={"ID":"5deedefb-63ba-4e8f-ba6b-bdf060d61f19","Type":"ContainerDied","Data":"3f0c0ec39b44d76dbc9feb59ab16adb11ef4ba740fb75d5427bbad8da1f9f9a1"} Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.884887 4790 scope.go:117] "RemoveContainer" containerID="82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.916641 4790 scope.go:117] "RemoveContainer" containerID="404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.940322 4790 scope.go:117] "RemoveContainer" containerID="7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.946158 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cjzrr"] Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.953287 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cjzrr"] Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.996034 4790 scope.go:117] "RemoveContainer" containerID="82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a" Oct 03 09:59:59 crc kubenswrapper[4790]: E1003 09:59:59.996762 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a\": container with ID starting with 82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a not found: ID does not exist" containerID="82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.996811 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a"} err="failed to get container status \"82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a\": rpc error: code = NotFound desc = could not find container \"82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a\": container with ID starting with 82fcae0622088e1c18da747d5f8d64f7a9b60965de389d54b54a841408ce046a not found: ID does not exist" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.996851 4790 scope.go:117] "RemoveContainer" containerID="404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6" Oct 03 09:59:59 crc kubenswrapper[4790]: E1003 09:59:59.997460 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6\": container with ID starting with 404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6 not found: ID does not exist" containerID="404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.997524 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6"} err="failed to get container status \"404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6\": rpc error: code = NotFound desc = could not find container \"404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6\": container with ID starting with 404e3c979a325f339a32e9d393460ad36166f23e556a00bfb3fc77c18a4fc8a6 not found: ID does not exist" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.997564 4790 scope.go:117] "RemoveContainer" containerID="7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8" Oct 03 09:59:59 crc kubenswrapper[4790]: E1003 
09:59:59.998056 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8\": container with ID starting with 7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8 not found: ID does not exist" containerID="7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8" Oct 03 09:59:59 crc kubenswrapper[4790]: I1003 09:59:59.998096 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8"} err="failed to get container status \"7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8\": rpc error: code = NotFound desc = could not find container \"7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8\": container with ID starting with 7d4af7e5ba1ae679d39e76e39b80b7bc66842772a3bdc4593b63d934b5804de8 not found: ID does not exist" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.150755 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9"] Oct 03 10:00:00 crc kubenswrapper[4790]: E1003 10:00:00.152331 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerName="registry-server" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.152452 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerName="registry-server" Oct 03 10:00:00 crc kubenswrapper[4790]: E1003 10:00:00.152571 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerName="extract-content" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.152666 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerName="extract-content" Oct 
03 10:00:00 crc kubenswrapper[4790]: E1003 10:00:00.152747 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerName="extract-utilities" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.152837 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerName="extract-utilities" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.153176 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" containerName="registry-server" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.154252 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.156952 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.157502 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.162558 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9"] Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.215012 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hw2c\" (UniqueName: \"kubernetes.io/projected/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-kube-api-access-2hw2c\") pod \"collect-profiles-29324760-k2jd9\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.215122 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-config-volume\") pod \"collect-profiles-29324760-k2jd9\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.215155 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-secret-volume\") pod \"collect-profiles-29324760-k2jd9\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.316662 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hw2c\" (UniqueName: \"kubernetes.io/projected/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-kube-api-access-2hw2c\") pod \"collect-profiles-29324760-k2jd9\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.317031 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-config-volume\") pod \"collect-profiles-29324760-k2jd9\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.317153 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-secret-volume\") pod \"collect-profiles-29324760-k2jd9\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.318872 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-config-volume\") pod \"collect-profiles-29324760-k2jd9\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.321817 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-secret-volume\") pod \"collect-profiles-29324760-k2jd9\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.337929 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hw2c\" (UniqueName: \"kubernetes.io/projected/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-kube-api-access-2hw2c\") pod \"collect-profiles-29324760-k2jd9\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.346342 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5deedefb-63ba-4e8f-ba6b-bdf060d61f19" path="/var/lib/kubelet/pods/5deedefb-63ba-4e8f-ba6b-bdf060d61f19/volumes" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.489657 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.790617 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zw9tw"] Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.793169 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.821826 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zw9tw"] Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.932006 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-catalog-content\") pod \"community-operators-zw9tw\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.933559 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjhdn\" (UniqueName: \"kubernetes.io/projected/f882b505-f5eb-4d03-91f9-d84b6567c247-kube-api-access-rjhdn\") pod \"community-operators-zw9tw\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.933718 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-utilities\") pod \"community-operators-zw9tw\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:00 crc kubenswrapper[4790]: I1003 10:00:00.970906 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9"] Oct 03 10:00:00 crc kubenswrapper[4790]: W1003 10:00:00.973028 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a7568ac_11e3_4adf_8ec2_57dae1a9ba0d.slice/crio-f522ec917141c0362df6ed69205b2ba585eeadfe4f7cfcf07213fba65443464f WatchSource:0}: Error finding container f522ec917141c0362df6ed69205b2ba585eeadfe4f7cfcf07213fba65443464f: Status 404 returned error can't find the container with id f522ec917141c0362df6ed69205b2ba585eeadfe4f7cfcf07213fba65443464f Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.035366 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjhdn\" (UniqueName: \"kubernetes.io/projected/f882b505-f5eb-4d03-91f9-d84b6567c247-kube-api-access-rjhdn\") pod \"community-operators-zw9tw\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.035585 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-utilities\") pod \"community-operators-zw9tw\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.035667 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-catalog-content\") pod \"community-operators-zw9tw\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.036142 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-utilities\") pod \"community-operators-zw9tw\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.036176 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-catalog-content\") pod \"community-operators-zw9tw\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.061692 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjhdn\" (UniqueName: \"kubernetes.io/projected/f882b505-f5eb-4d03-91f9-d84b6567c247-kube-api-access-rjhdn\") pod \"community-operators-zw9tw\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.128183 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.685970 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zw9tw"] Oct 03 10:00:01 crc kubenswrapper[4790]: W1003 10:00:01.735674 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf882b505_f5eb_4d03_91f9_d84b6567c247.slice/crio-9f4dfa85897a1825ee9ec4e1a6b6628e2b7923e8d3d602ebedb13f027c405bbc WatchSource:0}: Error finding container 9f4dfa85897a1825ee9ec4e1a6b6628e2b7923e8d3d602ebedb13f027c405bbc: Status 404 returned error can't find the container with id 9f4dfa85897a1825ee9ec4e1a6b6628e2b7923e8d3d602ebedb13f027c405bbc Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.907023 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw9tw" event={"ID":"f882b505-f5eb-4d03-91f9-d84b6567c247","Type":"ContainerStarted","Data":"d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057"} Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.907377 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw9tw" event={"ID":"f882b505-f5eb-4d03-91f9-d84b6567c247","Type":"ContainerStarted","Data":"9f4dfa85897a1825ee9ec4e1a6b6628e2b7923e8d3d602ebedb13f027c405bbc"} Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.910457 4790 generic.go:334] "Generic (PLEG): container finished" podID="0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d" containerID="1504755ee15f0d6135f56a7b320a41621f44bbb01ec5373a2d954196289d8d5f" exitCode=0 Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.910540 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" 
event={"ID":"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d","Type":"ContainerDied","Data":"1504755ee15f0d6135f56a7b320a41621f44bbb01ec5373a2d954196289d8d5f"} Oct 03 10:00:01 crc kubenswrapper[4790]: I1003 10:00:01.910602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" event={"ID":"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d","Type":"ContainerStarted","Data":"f522ec917141c0362df6ed69205b2ba585eeadfe4f7cfcf07213fba65443464f"} Oct 03 10:00:02 crc kubenswrapper[4790]: I1003 10:00:02.935903 4790 generic.go:334] "Generic (PLEG): container finished" podID="f882b505-f5eb-4d03-91f9-d84b6567c247" containerID="d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057" exitCode=0 Oct 03 10:00:02 crc kubenswrapper[4790]: I1003 10:00:02.936016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw9tw" event={"ID":"f882b505-f5eb-4d03-91f9-d84b6567c247","Type":"ContainerDied","Data":"d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057"} Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.358738 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.497555 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hw2c\" (UniqueName: \"kubernetes.io/projected/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-kube-api-access-2hw2c\") pod \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.498301 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-config-volume\") pod \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.499354 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-config-volume" (OuterVolumeSpecName: "config-volume") pod "0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d" (UID: "0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.499728 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-secret-volume\") pod \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\" (UID: \"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d\") " Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.501273 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.511794 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d" (UID: "0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.512895 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-kube-api-access-2hw2c" (OuterVolumeSpecName: "kube-api-access-2hw2c") pod "0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d" (UID: "0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d"). InnerVolumeSpecName "kube-api-access-2hw2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.604227 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hw2c\" (UniqueName: \"kubernetes.io/projected/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-kube-api-access-2hw2c\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.604270 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.963222 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" event={"ID":"0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d","Type":"ContainerDied","Data":"f522ec917141c0362df6ed69205b2ba585eeadfe4f7cfcf07213fba65443464f"} Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.963276 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f522ec917141c0362df6ed69205b2ba585eeadfe4f7cfcf07213fba65443464f" Oct 03 10:00:03 crc kubenswrapper[4790]: I1003 10:00:03.963375 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324760-k2jd9" Oct 03 10:00:04 crc kubenswrapper[4790]: I1003 10:00:04.451963 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f"] Oct 03 10:00:04 crc kubenswrapper[4790]: I1003 10:00:04.462567 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324715-f4g2f"] Oct 03 10:00:04 crc kubenswrapper[4790]: I1003 10:00:04.984470 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw9tw" event={"ID":"f882b505-f5eb-4d03-91f9-d84b6567c247","Type":"ContainerStarted","Data":"43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c"} Oct 03 10:00:05 crc kubenswrapper[4790]: I1003 10:00:05.995920 4790 generic.go:334] "Generic (PLEG): container finished" podID="f882b505-f5eb-4d03-91f9-d84b6567c247" containerID="43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c" exitCode=0 Oct 03 10:00:05 crc kubenswrapper[4790]: I1003 10:00:05.995988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw9tw" event={"ID":"f882b505-f5eb-4d03-91f9-d84b6567c247","Type":"ContainerDied","Data":"43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c"} Oct 03 10:00:06 crc kubenswrapper[4790]: I1003 10:00:06.341794 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d654b2-107a-4eae-b098-12182f3a239a" path="/var/lib/kubelet/pods/41d654b2-107a-4eae-b098-12182f3a239a/volumes" Oct 03 10:00:07 crc kubenswrapper[4790]: I1003 10:00:07.024724 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw9tw" event={"ID":"f882b505-f5eb-4d03-91f9-d84b6567c247","Type":"ContainerStarted","Data":"e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09"} Oct 03 10:00:07 crc 
kubenswrapper[4790]: I1003 10:00:07.048207 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zw9tw" podStartSLOduration=3.516580694 podStartE2EDuration="7.048184524s" podCreationTimestamp="2025-10-03 10:00:00 +0000 UTC" firstStartedPulling="2025-10-03 10:00:02.939303658 +0000 UTC m=+4789.292237896" lastFinishedPulling="2025-10-03 10:00:06.470907488 +0000 UTC m=+4792.823841726" observedRunningTime="2025-10-03 10:00:07.044580994 +0000 UTC m=+4793.397515232" watchObservedRunningTime="2025-10-03 10:00:07.048184524 +0000 UTC m=+4793.401118762" Oct 03 10:00:10 crc kubenswrapper[4790]: I1003 10:00:10.186060 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:00:10 crc kubenswrapper[4790]: I1003 10:00:10.186770 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:00:10 crc kubenswrapper[4790]: I1003 10:00:10.186825 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 10:00:10 crc kubenswrapper[4790]: I1003 10:00:10.187737 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4fffdcc571671131c53b354b11ff7cfa23698400512e2f6c8453ede4bebf434"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Oct 03 10:00:10 crc kubenswrapper[4790]: I1003 10:00:10.187798 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://b4fffdcc571671131c53b354b11ff7cfa23698400512e2f6c8453ede4bebf434" gracePeriod=600 Oct 03 10:00:11 crc kubenswrapper[4790]: I1003 10:00:11.068671 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="b4fffdcc571671131c53b354b11ff7cfa23698400512e2f6c8453ede4bebf434" exitCode=0 Oct 03 10:00:11 crc kubenswrapper[4790]: I1003 10:00:11.068749 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"b4fffdcc571671131c53b354b11ff7cfa23698400512e2f6c8453ede4bebf434"} Oct 03 10:00:11 crc kubenswrapper[4790]: I1003 10:00:11.069389 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac"} Oct 03 10:00:11 crc kubenswrapper[4790]: I1003 10:00:11.069425 4790 scope.go:117] "RemoveContainer" containerID="8e0196bb36398706962d585a007c8fc442ef4fad86d7a5243997f04886073380" Oct 03 10:00:11 crc kubenswrapper[4790]: I1003 10:00:11.128986 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:11 crc kubenswrapper[4790]: I1003 10:00:11.129318 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:11 crc kubenswrapper[4790]: I1003 10:00:11.193526 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:12 crc kubenswrapper[4790]: I1003 10:00:12.142943 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:12 crc kubenswrapper[4790]: I1003 10:00:12.239094 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zw9tw"] Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.109448 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zw9tw" podUID="f882b505-f5eb-4d03-91f9-d84b6567c247" containerName="registry-server" containerID="cri-o://e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09" gracePeriod=2 Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.683837 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.848176 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-utilities\") pod \"f882b505-f5eb-4d03-91f9-d84b6567c247\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.848278 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-catalog-content\") pod \"f882b505-f5eb-4d03-91f9-d84b6567c247\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.848339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjhdn\" (UniqueName: \"kubernetes.io/projected/f882b505-f5eb-4d03-91f9-d84b6567c247-kube-api-access-rjhdn\") pod 
\"f882b505-f5eb-4d03-91f9-d84b6567c247\" (UID: \"f882b505-f5eb-4d03-91f9-d84b6567c247\") " Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.849010 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-utilities" (OuterVolumeSpecName: "utilities") pod "f882b505-f5eb-4d03-91f9-d84b6567c247" (UID: "f882b505-f5eb-4d03-91f9-d84b6567c247"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.861866 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f882b505-f5eb-4d03-91f9-d84b6567c247-kube-api-access-rjhdn" (OuterVolumeSpecName: "kube-api-access-rjhdn") pod "f882b505-f5eb-4d03-91f9-d84b6567c247" (UID: "f882b505-f5eb-4d03-91f9-d84b6567c247"). InnerVolumeSpecName "kube-api-access-rjhdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.912963 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f882b505-f5eb-4d03-91f9-d84b6567c247" (UID: "f882b505-f5eb-4d03-91f9-d84b6567c247"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.950799 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.950837 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f882b505-f5eb-4d03-91f9-d84b6567c247-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:14 crc kubenswrapper[4790]: I1003 10:00:14.950855 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjhdn\" (UniqueName: \"kubernetes.io/projected/f882b505-f5eb-4d03-91f9-d84b6567c247-kube-api-access-rjhdn\") on node \"crc\" DevicePath \"\"" Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.122102 4790 generic.go:334] "Generic (PLEG): container finished" podID="f882b505-f5eb-4d03-91f9-d84b6567c247" containerID="e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09" exitCode=0 Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.122169 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw9tw" event={"ID":"f882b505-f5eb-4d03-91f9-d84b6567c247","Type":"ContainerDied","Data":"e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09"} Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.122507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zw9tw" event={"ID":"f882b505-f5eb-4d03-91f9-d84b6567c247","Type":"ContainerDied","Data":"9f4dfa85897a1825ee9ec4e1a6b6628e2b7923e8d3d602ebedb13f027c405bbc"} Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.122203 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zw9tw" Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.122530 4790 scope.go:117] "RemoveContainer" containerID="e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09" Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.156468 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zw9tw"] Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.161985 4790 scope.go:117] "RemoveContainer" containerID="43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c" Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.165942 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zw9tw"] Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.187399 4790 scope.go:117] "RemoveContainer" containerID="d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057" Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.233668 4790 scope.go:117] "RemoveContainer" containerID="e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09" Oct 03 10:00:15 crc kubenswrapper[4790]: E1003 10:00:15.234123 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09\": container with ID starting with e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09 not found: ID does not exist" containerID="e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09" Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.234176 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09"} err="failed to get container status \"e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09\": rpc error: code = NotFound desc = could not find 
container \"e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09\": container with ID starting with e70aa3edc7505d5d3e6024ddb5b2f9f862c7edeac4b2627b2f913991b5a83c09 not found: ID does not exist" Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.234203 4790 scope.go:117] "RemoveContainer" containerID="43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c" Oct 03 10:00:15 crc kubenswrapper[4790]: E1003 10:00:15.234667 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c\": container with ID starting with 43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c not found: ID does not exist" containerID="43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c" Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.234702 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c"} err="failed to get container status \"43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c\": rpc error: code = NotFound desc = could not find container \"43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c\": container with ID starting with 43c5b725c38ee4ec522237bffceaac51adaafa6d6676be759181b795abc6d78c not found: ID does not exist" Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.234722 4790 scope.go:117] "RemoveContainer" containerID="d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057" Oct 03 10:00:15 crc kubenswrapper[4790]: E1003 10:00:15.235051 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057\": container with ID starting with d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057 not found: ID does 
not exist" containerID="d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057" Oct 03 10:00:15 crc kubenswrapper[4790]: I1003 10:00:15.235202 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057"} err="failed to get container status \"d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057\": rpc error: code = NotFound desc = could not find container \"d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057\": container with ID starting with d40753449b3f31bea4d91efd96310c2899bd332b03b2fb4515bf52d72d0d2057 not found: ID does not exist" Oct 03 10:00:16 crc kubenswrapper[4790]: I1003 10:00:16.340916 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f882b505-f5eb-4d03-91f9-d84b6567c247" path="/var/lib/kubelet/pods/f882b505-f5eb-4d03-91f9-d84b6567c247/volumes" Oct 03 10:00:30 crc kubenswrapper[4790]: I1003 10:00:30.247351 4790 scope.go:117] "RemoveContainer" containerID="4fb5900507f809b2410d418636e5371467a766a9735d86e39a851dea85b52620" Oct 03 10:00:30 crc kubenswrapper[4790]: I1003 10:00:30.286896 4790 scope.go:117] "RemoveContainer" containerID="a192ba19f859491de41bee269759939cacab6c07141a220e4fce99e939383897" Oct 03 10:00:30 crc kubenswrapper[4790]: I1003 10:00:30.329409 4790 scope.go:117] "RemoveContainer" containerID="8a3121ca6aa9e11b5dad943f67b712e0989fae9d1fa2c001d1548de485125e11" Oct 03 10:00:30 crc kubenswrapper[4790]: I1003 10:00:30.354128 4790 scope.go:117] "RemoveContainer" containerID="2ac122f766f58060ad178d2e3f4682fee79c2cae57bd85d339c08fc28532bd3a" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.179368 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29324761-s6cn9"] Oct 03 10:01:00 crc kubenswrapper[4790]: E1003 10:01:00.180837 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f882b505-f5eb-4d03-91f9-d84b6567c247" 
containerName="extract-utilities" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.180856 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f882b505-f5eb-4d03-91f9-d84b6567c247" containerName="extract-utilities" Oct 03 10:01:00 crc kubenswrapper[4790]: E1003 10:01:00.180900 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f882b505-f5eb-4d03-91f9-d84b6567c247" containerName="registry-server" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.180907 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f882b505-f5eb-4d03-91f9-d84b6567c247" containerName="registry-server" Oct 03 10:01:00 crc kubenswrapper[4790]: E1003 10:01:00.180916 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d" containerName="collect-profiles" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.180923 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d" containerName="collect-profiles" Oct 03 10:01:00 crc kubenswrapper[4790]: E1003 10:01:00.180943 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f882b505-f5eb-4d03-91f9-d84b6567c247" containerName="extract-content" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.180951 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f882b505-f5eb-4d03-91f9-d84b6567c247" containerName="extract-content" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.181289 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f882b505-f5eb-4d03-91f9-d84b6567c247" containerName="registry-server" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.181324 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7568ac-11e3-4adf-8ec2-57dae1a9ba0d" containerName="collect-profiles" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.182238 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.194539 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29324761-s6cn9"] Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.282949 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-config-data\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.283016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-combined-ca-bundle\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.283420 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tpg7\" (UniqueName: \"kubernetes.io/projected/95907765-166e-4e38-a906-86b4020f3b5c-kube-api-access-5tpg7\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.283484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-fernet-keys\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.385765 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-fernet-keys\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.385808 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tpg7\" (UniqueName: \"kubernetes.io/projected/95907765-166e-4e38-a906-86b4020f3b5c-kube-api-access-5tpg7\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.385929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-config-data\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.385988 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-combined-ca-bundle\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.400181 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-config-data\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.400279 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-combined-ca-bundle\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.402526 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-fernet-keys\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.402855 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tpg7\" (UniqueName: \"kubernetes.io/projected/95907765-166e-4e38-a906-86b4020f3b5c-kube-api-access-5tpg7\") pod \"keystone-cron-29324761-s6cn9\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.508903 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:00 crc kubenswrapper[4790]: I1003 10:01:00.995500 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29324761-s6cn9"] Oct 03 10:01:01 crc kubenswrapper[4790]: I1003 10:01:01.572240 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324761-s6cn9" event={"ID":"95907765-166e-4e38-a906-86b4020f3b5c","Type":"ContainerStarted","Data":"c254f6246e5383c2663283b32bd9450517f4e029eac578d960a5616038fba959"} Oct 03 10:01:01 crc kubenswrapper[4790]: I1003 10:01:01.572620 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324761-s6cn9" event={"ID":"95907765-166e-4e38-a906-86b4020f3b5c","Type":"ContainerStarted","Data":"d2ae130a9065b15b29f0ca0c945bf04644b31637588eb07bb19f33009a472b77"} Oct 03 10:01:01 crc kubenswrapper[4790]: I1003 10:01:01.592428 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29324761-s6cn9" podStartSLOduration=1.592409414 podStartE2EDuration="1.592409414s" podCreationTimestamp="2025-10-03 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:01:01.588337802 +0000 UTC m=+4847.941272060" watchObservedRunningTime="2025-10-03 10:01:01.592409414 +0000 UTC m=+4847.945343652" Oct 03 10:01:05 crc kubenswrapper[4790]: I1003 10:01:05.622956 4790 generic.go:334] "Generic (PLEG): container finished" podID="95907765-166e-4e38-a906-86b4020f3b5c" containerID="c254f6246e5383c2663283b32bd9450517f4e029eac578d960a5616038fba959" exitCode=0 Oct 03 10:01:05 crc kubenswrapper[4790]: I1003 10:01:05.623049 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324761-s6cn9" 
event={"ID":"95907765-166e-4e38-a906-86b4020f3b5c","Type":"ContainerDied","Data":"c254f6246e5383c2663283b32bd9450517f4e029eac578d960a5616038fba959"} Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.037988 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.135217 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-fernet-keys\") pod \"95907765-166e-4e38-a906-86b4020f3b5c\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.135302 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tpg7\" (UniqueName: \"kubernetes.io/projected/95907765-166e-4e38-a906-86b4020f3b5c-kube-api-access-5tpg7\") pod \"95907765-166e-4e38-a906-86b4020f3b5c\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.135432 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-config-data\") pod \"95907765-166e-4e38-a906-86b4020f3b5c\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.135507 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-combined-ca-bundle\") pod \"95907765-166e-4e38-a906-86b4020f3b5c\" (UID: \"95907765-166e-4e38-a906-86b4020f3b5c\") " Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.142931 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95907765-166e-4e38-a906-86b4020f3b5c-kube-api-access-5tpg7" 
(OuterVolumeSpecName: "kube-api-access-5tpg7") pod "95907765-166e-4e38-a906-86b4020f3b5c" (UID: "95907765-166e-4e38-a906-86b4020f3b5c"). InnerVolumeSpecName "kube-api-access-5tpg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.149072 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "95907765-166e-4e38-a906-86b4020f3b5c" (UID: "95907765-166e-4e38-a906-86b4020f3b5c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.168424 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95907765-166e-4e38-a906-86b4020f3b5c" (UID: "95907765-166e-4e38-a906-86b4020f3b5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.191080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-config-data" (OuterVolumeSpecName: "config-data") pod "95907765-166e-4e38-a906-86b4020f3b5c" (UID: "95907765-166e-4e38-a906-86b4020f3b5c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.237912 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.237959 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.237973 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/95907765-166e-4e38-a906-86b4020f3b5c-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.237985 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tpg7\" (UniqueName: \"kubernetes.io/projected/95907765-166e-4e38-a906-86b4020f3b5c-kube-api-access-5tpg7\") on node \"crc\" DevicePath \"\"" Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.645427 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29324761-s6cn9" event={"ID":"95907765-166e-4e38-a906-86b4020f3b5c","Type":"ContainerDied","Data":"d2ae130a9065b15b29f0ca0c945bf04644b31637588eb07bb19f33009a472b77"} Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.645479 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2ae130a9065b15b29f0ca0c945bf04644b31637588eb07bb19f33009a472b77" Oct 03 10:01:07 crc kubenswrapper[4790]: I1003 10:01:07.645481 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29324761-s6cn9" Oct 03 10:02:10 crc kubenswrapper[4790]: I1003 10:02:10.186885 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:02:10 crc kubenswrapper[4790]: I1003 10:02:10.187631 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:02:40 crc kubenswrapper[4790]: I1003 10:02:40.185907 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:02:40 crc kubenswrapper[4790]: I1003 10:02:40.186562 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:03:10 crc kubenswrapper[4790]: I1003 10:03:10.186280 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:03:10 crc kubenswrapper[4790]: I1003 10:03:10.187036 4790 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:03:10 crc kubenswrapper[4790]: I1003 10:03:10.187085 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 10:03:10 crc kubenswrapper[4790]: I1003 10:03:10.187891 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:03:10 crc kubenswrapper[4790]: I1003 10:03:10.187975 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" gracePeriod=600 Oct 03 10:03:10 crc kubenswrapper[4790]: E1003 10:03:10.307407 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:03:10 crc kubenswrapper[4790]: I1003 10:03:10.826434 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" exitCode=0 Oct 03 10:03:10 crc kubenswrapper[4790]: I1003 10:03:10.826483 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac"} Oct 03 10:03:10 crc kubenswrapper[4790]: I1003 10:03:10.826521 4790 scope.go:117] "RemoveContainer" containerID="b4fffdcc571671131c53b354b11ff7cfa23698400512e2f6c8453ede4bebf434" Oct 03 10:03:10 crc kubenswrapper[4790]: I1003 10:03:10.827232 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:03:10 crc kubenswrapper[4790]: E1003 10:03:10.827540 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:03:26 crc kubenswrapper[4790]: I1003 10:03:26.328246 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:03:26 crc kubenswrapper[4790]: E1003 10:03:26.329251 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 
10:03:41 crc kubenswrapper[4790]: I1003 10:03:41.328747 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:03:41 crc kubenswrapper[4790]: E1003 10:03:41.329489 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:03:54 crc kubenswrapper[4790]: I1003 10:03:54.336238 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:03:54 crc kubenswrapper[4790]: E1003 10:03:54.337214 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:04:08 crc kubenswrapper[4790]: I1003 10:04:08.328882 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:04:08 crc kubenswrapper[4790]: E1003 10:04:08.329934 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" 
podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:04:20 crc kubenswrapper[4790]: I1003 10:04:20.329169 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:04:20 crc kubenswrapper[4790]: E1003 10:04:20.330101 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:04:32 crc kubenswrapper[4790]: I1003 10:04:32.329069 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:04:32 crc kubenswrapper[4790]: E1003 10:04:32.330098 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:04:46 crc kubenswrapper[4790]: I1003 10:04:46.329115 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:04:46 crc kubenswrapper[4790]: E1003 10:04:46.330127 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:04:59 crc kubenswrapper[4790]: I1003 10:04:59.953774 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bsrpb"] Oct 03 10:04:59 crc kubenswrapper[4790]: E1003 10:04:59.954977 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95907765-166e-4e38-a906-86b4020f3b5c" containerName="keystone-cron" Oct 03 10:04:59 crc kubenswrapper[4790]: I1003 10:04:59.954997 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="95907765-166e-4e38-a906-86b4020f3b5c" containerName="keystone-cron" Oct 03 10:04:59 crc kubenswrapper[4790]: I1003 10:04:59.955259 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="95907765-166e-4e38-a906-86b4020f3b5c" containerName="keystone-cron" Oct 03 10:04:59 crc kubenswrapper[4790]: I1003 10:04:59.957103 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:04:59 crc kubenswrapper[4790]: I1003 10:04:59.968320 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsrpb"] Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.116640 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-utilities\") pod \"redhat-marketplace-bsrpb\" (UID: \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.116855 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47jkq\" (UniqueName: \"kubernetes.io/projected/39d14cba-e5c8-4839-9bad-f2e2068ca43e-kube-api-access-47jkq\") pod \"redhat-marketplace-bsrpb\" (UID: 
\"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.116909 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-catalog-content\") pod \"redhat-marketplace-bsrpb\" (UID: \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.218618 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47jkq\" (UniqueName: \"kubernetes.io/projected/39d14cba-e5c8-4839-9bad-f2e2068ca43e-kube-api-access-47jkq\") pod \"redhat-marketplace-bsrpb\" (UID: \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.218718 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-catalog-content\") pod \"redhat-marketplace-bsrpb\" (UID: \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.218794 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-utilities\") pod \"redhat-marketplace-bsrpb\" (UID: \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.219153 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-catalog-content\") pod \"redhat-marketplace-bsrpb\" (UID: 
\"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.219212 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-utilities\") pod \"redhat-marketplace-bsrpb\" (UID: \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.249831 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47jkq\" (UniqueName: \"kubernetes.io/projected/39d14cba-e5c8-4839-9bad-f2e2068ca43e-kube-api-access-47jkq\") pod \"redhat-marketplace-bsrpb\" (UID: \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.280433 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.783471 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsrpb"] Oct 03 10:05:00 crc kubenswrapper[4790]: I1003 10:05:00.888097 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsrpb" event={"ID":"39d14cba-e5c8-4839-9bad-f2e2068ca43e","Type":"ContainerStarted","Data":"5ab381a9e3fe773b77dddec7c83b6ba445d2bde8e27076eb909c75467764b144"} Oct 03 10:05:01 crc kubenswrapper[4790]: I1003 10:05:01.328312 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:05:01 crc kubenswrapper[4790]: E1003 10:05:01.329505 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:05:01 crc kubenswrapper[4790]: I1003 10:05:01.900967 4790 generic.go:334] "Generic (PLEG): container finished" podID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerID="b80913a8c5759b12224525de539cf69367e79164219ce7b97a6d56fb26c6be6e" exitCode=0 Oct 03 10:05:01 crc kubenswrapper[4790]: I1003 10:05:01.901043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsrpb" event={"ID":"39d14cba-e5c8-4839-9bad-f2e2068ca43e","Type":"ContainerDied","Data":"b80913a8c5759b12224525de539cf69367e79164219ce7b97a6d56fb26c6be6e"} Oct 03 10:05:01 crc kubenswrapper[4790]: I1003 10:05:01.903554 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:05:02 crc kubenswrapper[4790]: I1003 10:05:02.912778 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsrpb" event={"ID":"39d14cba-e5c8-4839-9bad-f2e2068ca43e","Type":"ContainerStarted","Data":"6f07fed0dd7279f584ebd88515ea6b0f8ec8f0a97906c04c0f817573108590a2"} Oct 03 10:05:03 crc kubenswrapper[4790]: I1003 10:05:03.939130 4790 generic.go:334] "Generic (PLEG): container finished" podID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerID="6f07fed0dd7279f584ebd88515ea6b0f8ec8f0a97906c04c0f817573108590a2" exitCode=0 Oct 03 10:05:03 crc kubenswrapper[4790]: I1003 10:05:03.939176 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsrpb" event={"ID":"39d14cba-e5c8-4839-9bad-f2e2068ca43e","Type":"ContainerDied","Data":"6f07fed0dd7279f584ebd88515ea6b0f8ec8f0a97906c04c0f817573108590a2"} Oct 03 10:05:04 crc kubenswrapper[4790]: I1003 10:05:04.950380 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-bsrpb" event={"ID":"39d14cba-e5c8-4839-9bad-f2e2068ca43e","Type":"ContainerStarted","Data":"ca3f46924a8e0f5c549136ae87bcb68b39e12d0273a6af0da393d15db6b316d4"} Oct 03 10:05:04 crc kubenswrapper[4790]: I1003 10:05:04.970017 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bsrpb" podStartSLOduration=3.478429305 podStartE2EDuration="5.969998983s" podCreationTimestamp="2025-10-03 10:04:59 +0000 UTC" firstStartedPulling="2025-10-03 10:05:01.903204766 +0000 UTC m=+5088.256139004" lastFinishedPulling="2025-10-03 10:05:04.394774444 +0000 UTC m=+5090.747708682" observedRunningTime="2025-10-03 10:05:04.968669006 +0000 UTC m=+5091.321603254" watchObservedRunningTime="2025-10-03 10:05:04.969998983 +0000 UTC m=+5091.322933221" Oct 03 10:05:10 crc kubenswrapper[4790]: I1003 10:05:10.280825 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:10 crc kubenswrapper[4790]: I1003 10:05:10.281360 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:10 crc kubenswrapper[4790]: I1003 10:05:10.377438 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:11 crc kubenswrapper[4790]: I1003 10:05:11.063368 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:11 crc kubenswrapper[4790]: I1003 10:05:11.129860 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsrpb"] Oct 03 10:05:13 crc kubenswrapper[4790]: I1003 10:05:13.032858 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bsrpb" 
podUID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerName="registry-server" containerID="cri-o://ca3f46924a8e0f5c549136ae87bcb68b39e12d0273a6af0da393d15db6b316d4" gracePeriod=2 Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.044913 4790 generic.go:334] "Generic (PLEG): container finished" podID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerID="ca3f46924a8e0f5c549136ae87bcb68b39e12d0273a6af0da393d15db6b316d4" exitCode=0 Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.045016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsrpb" event={"ID":"39d14cba-e5c8-4839-9bad-f2e2068ca43e","Type":"ContainerDied","Data":"ca3f46924a8e0f5c549136ae87bcb68b39e12d0273a6af0da393d15db6b316d4"} Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.045301 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsrpb" event={"ID":"39d14cba-e5c8-4839-9bad-f2e2068ca43e","Type":"ContainerDied","Data":"5ab381a9e3fe773b77dddec7c83b6ba445d2bde8e27076eb909c75467764b144"} Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.045320 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ab381a9e3fe773b77dddec7c83b6ba445d2bde8e27076eb909c75467764b144" Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.097170 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.211969 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-catalog-content\") pod \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\" (UID: \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.212048 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47jkq\" (UniqueName: \"kubernetes.io/projected/39d14cba-e5c8-4839-9bad-f2e2068ca43e-kube-api-access-47jkq\") pod \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\" (UID: \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.212199 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-utilities\") pod \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\" (UID: \"39d14cba-e5c8-4839-9bad-f2e2068ca43e\") " Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.213085 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-utilities" (OuterVolumeSpecName: "utilities") pod "39d14cba-e5c8-4839-9bad-f2e2068ca43e" (UID: "39d14cba-e5c8-4839-9bad-f2e2068ca43e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.219903 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d14cba-e5c8-4839-9bad-f2e2068ca43e-kube-api-access-47jkq" (OuterVolumeSpecName: "kube-api-access-47jkq") pod "39d14cba-e5c8-4839-9bad-f2e2068ca43e" (UID: "39d14cba-e5c8-4839-9bad-f2e2068ca43e"). InnerVolumeSpecName "kube-api-access-47jkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.228963 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39d14cba-e5c8-4839-9bad-f2e2068ca43e" (UID: "39d14cba-e5c8-4839-9bad-f2e2068ca43e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.314873 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.314940 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47jkq\" (UniqueName: \"kubernetes.io/projected/39d14cba-e5c8-4839-9bad-f2e2068ca43e-kube-api-access-47jkq\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:14 crc kubenswrapper[4790]: I1003 10:05:14.314950 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39d14cba-e5c8-4839-9bad-f2e2068ca43e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:05:15 crc kubenswrapper[4790]: I1003 10:05:15.054557 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsrpb" Oct 03 10:05:15 crc kubenswrapper[4790]: I1003 10:05:15.077842 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsrpb"] Oct 03 10:05:15 crc kubenswrapper[4790]: I1003 10:05:15.087196 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsrpb"] Oct 03 10:05:16 crc kubenswrapper[4790]: I1003 10:05:16.328102 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:05:16 crc kubenswrapper[4790]: E1003 10:05:16.328767 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:05:16 crc kubenswrapper[4790]: I1003 10:05:16.339704 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" path="/var/lib/kubelet/pods/39d14cba-e5c8-4839-9bad-f2e2068ca43e/volumes" Oct 03 10:05:18 crc kubenswrapper[4790]: E1003 10:05:18.933691 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice/crio-5ab381a9e3fe773b77dddec7c83b6ba445d2bde8e27076eb909c75467764b144\": RecentStats: unable to find data in memory cache]" Oct 03 10:05:29 crc kubenswrapper[4790]: E1003 10:05:29.249682 4790 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice/crio-5ab381a9e3fe773b77dddec7c83b6ba445d2bde8e27076eb909c75467764b144\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice\": RecentStats: unable to find data in memory cache]" Oct 03 10:05:29 crc kubenswrapper[4790]: I1003 10:05:29.328850 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:05:29 crc kubenswrapper[4790]: E1003 10:05:29.329367 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:05:39 crc kubenswrapper[4790]: E1003 10:05:39.509761 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice/crio-5ab381a9e3fe773b77dddec7c83b6ba445d2bde8e27076eb909c75467764b144\": RecentStats: unable to find data in memory cache]" Oct 03 10:05:41 crc kubenswrapper[4790]: I1003 10:05:41.328934 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:05:41 crc kubenswrapper[4790]: E1003 10:05:41.329796 4790 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:05:49 crc kubenswrapper[4790]: E1003 10:05:49.779055 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice/crio-5ab381a9e3fe773b77dddec7c83b6ba445d2bde8e27076eb909c75467764b144\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice\": RecentStats: unable to find data in memory cache]" Oct 03 10:05:56 crc kubenswrapper[4790]: I1003 10:05:56.328707 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:05:56 crc kubenswrapper[4790]: E1003 10:05:56.329544 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:06:00 crc kubenswrapper[4790]: E1003 10:06:00.036819 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice/crio-5ab381a9e3fe773b77dddec7c83b6ba445d2bde8e27076eb909c75467764b144\": RecentStats: unable to find data in memory cache]" Oct 03 10:06:07 crc kubenswrapper[4790]: I1003 10:06:07.328324 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:06:07 crc kubenswrapper[4790]: E1003 10:06:07.329138 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:06:10 crc kubenswrapper[4790]: E1003 10:06:10.280595 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d14cba_e5c8_4839_9bad_f2e2068ca43e.slice/crio-5ab381a9e3fe773b77dddec7c83b6ba445d2bde8e27076eb909c75467764b144\": RecentStats: unable to find data in memory cache]" Oct 03 10:06:18 crc kubenswrapper[4790]: I1003 10:06:18.328936 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:06:18 crc kubenswrapper[4790]: E1003 10:06:18.329770 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:06:30 crc kubenswrapper[4790]: I1003 10:06:30.328727 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:06:30 crc kubenswrapper[4790]: E1003 10:06:30.329643 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:06:42 crc kubenswrapper[4790]: I1003 10:06:42.328566 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:06:42 crc kubenswrapper[4790]: E1003 10:06:42.329408 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:06:57 crc kubenswrapper[4790]: I1003 10:06:57.328935 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:06:57 crc kubenswrapper[4790]: E1003 10:06:57.329645 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:07:10 crc kubenswrapper[4790]: I1003 10:07:10.328124 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:07:10 crc kubenswrapper[4790]: E1003 10:07:10.328918 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:07:25 crc kubenswrapper[4790]: I1003 10:07:25.328426 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:07:25 crc kubenswrapper[4790]: E1003 10:07:25.329348 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:07:25 crc kubenswrapper[4790]: I1003 10:07:25.946353 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-96rlt"] Oct 03 10:07:25 crc kubenswrapper[4790]: E1003 10:07:25.946930 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerName="extract-content" Oct 03 10:07:25 crc 
kubenswrapper[4790]: I1003 10:07:25.946955 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerName="extract-content" Oct 03 10:07:25 crc kubenswrapper[4790]: E1003 10:07:25.946991 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerName="extract-utilities" Oct 03 10:07:25 crc kubenswrapper[4790]: I1003 10:07:25.947003 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerName="extract-utilities" Oct 03 10:07:25 crc kubenswrapper[4790]: E1003 10:07:25.947032 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerName="registry-server" Oct 03 10:07:25 crc kubenswrapper[4790]: I1003 10:07:25.947041 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerName="registry-server" Oct 03 10:07:25 crc kubenswrapper[4790]: I1003 10:07:25.947310 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="39d14cba-e5c8-4839-9bad-f2e2068ca43e" containerName="registry-server" Oct 03 10:07:25 crc kubenswrapper[4790]: I1003 10:07:25.949389 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:25 crc kubenswrapper[4790]: I1003 10:07:25.974057 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-96rlt"] Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.050488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-utilities\") pod \"certified-operators-96rlt\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.050547 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vvlf\" (UniqueName: \"kubernetes.io/projected/80572eaf-e996-4768-9a86-9812d3aaba6a-kube-api-access-7vvlf\") pod \"certified-operators-96rlt\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.050879 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-catalog-content\") pod \"certified-operators-96rlt\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.153830 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-catalog-content\") pod \"certified-operators-96rlt\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.153983 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-utilities\") pod \"certified-operators-96rlt\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.154012 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vvlf\" (UniqueName: \"kubernetes.io/projected/80572eaf-e996-4768-9a86-9812d3aaba6a-kube-api-access-7vvlf\") pod \"certified-operators-96rlt\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.154292 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-catalog-content\") pod \"certified-operators-96rlt\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.154332 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-utilities\") pod \"certified-operators-96rlt\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.179497 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vvlf\" (UniqueName: \"kubernetes.io/projected/80572eaf-e996-4768-9a86-9812d3aaba6a-kube-api-access-7vvlf\") pod \"certified-operators-96rlt\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.276381 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:26 crc kubenswrapper[4790]: I1003 10:07:26.858131 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-96rlt"] Oct 03 10:07:27 crc kubenswrapper[4790]: I1003 10:07:27.385681 4790 generic.go:334] "Generic (PLEG): container finished" podID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerID="5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435" exitCode=0 Oct 03 10:07:27 crc kubenswrapper[4790]: I1003 10:07:27.385784 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96rlt" event={"ID":"80572eaf-e996-4768-9a86-9812d3aaba6a","Type":"ContainerDied","Data":"5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435"} Oct 03 10:07:27 crc kubenswrapper[4790]: I1003 10:07:27.386016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96rlt" event={"ID":"80572eaf-e996-4768-9a86-9812d3aaba6a","Type":"ContainerStarted","Data":"e00018367a781a51020de5ea1f718111b99435a2ea611aac868ebf5b59f98777"} Oct 03 10:07:28 crc kubenswrapper[4790]: I1003 10:07:28.396156 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96rlt" event={"ID":"80572eaf-e996-4768-9a86-9812d3aaba6a","Type":"ContainerStarted","Data":"9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527"} Oct 03 10:07:29 crc kubenswrapper[4790]: I1003 10:07:29.408526 4790 generic.go:334] "Generic (PLEG): container finished" podID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerID="9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527" exitCode=0 Oct 03 10:07:29 crc kubenswrapper[4790]: I1003 10:07:29.408587 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96rlt" 
event={"ID":"80572eaf-e996-4768-9a86-9812d3aaba6a","Type":"ContainerDied","Data":"9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527"} Oct 03 10:07:30 crc kubenswrapper[4790]: I1003 10:07:30.426871 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96rlt" event={"ID":"80572eaf-e996-4768-9a86-9812d3aaba6a","Type":"ContainerStarted","Data":"179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa"} Oct 03 10:07:30 crc kubenswrapper[4790]: I1003 10:07:30.477083 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-96rlt" podStartSLOduration=2.8210479299999998 podStartE2EDuration="5.477044164s" podCreationTimestamp="2025-10-03 10:07:25 +0000 UTC" firstStartedPulling="2025-10-03 10:07:27.390157254 +0000 UTC m=+5233.743091492" lastFinishedPulling="2025-10-03 10:07:30.046153448 +0000 UTC m=+5236.399087726" observedRunningTime="2025-10-03 10:07:30.470114512 +0000 UTC m=+5236.823048750" watchObservedRunningTime="2025-10-03 10:07:30.477044164 +0000 UTC m=+5236.829978412" Oct 03 10:07:36 crc kubenswrapper[4790]: I1003 10:07:36.277393 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:36 crc kubenswrapper[4790]: I1003 10:07:36.280425 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:36 crc kubenswrapper[4790]: I1003 10:07:36.348469 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:36 crc kubenswrapper[4790]: I1003 10:07:36.530973 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:36 crc kubenswrapper[4790]: I1003 10:07:36.600914 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-96rlt"] Oct 03 10:07:37 crc kubenswrapper[4790]: I1003 10:07:37.328485 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:07:37 crc kubenswrapper[4790]: E1003 10:07:37.328811 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:07:38 crc kubenswrapper[4790]: I1003 10:07:38.497002 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-96rlt" podUID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerName="registry-server" containerID="cri-o://179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa" gracePeriod=2 Oct 03 10:07:38 crc kubenswrapper[4790]: I1003 10:07:38.949549 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.042493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-utilities\") pod \"80572eaf-e996-4768-9a86-9812d3aaba6a\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.042695 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-catalog-content\") pod \"80572eaf-e996-4768-9a86-9812d3aaba6a\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.042806 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vvlf\" (UniqueName: \"kubernetes.io/projected/80572eaf-e996-4768-9a86-9812d3aaba6a-kube-api-access-7vvlf\") pod \"80572eaf-e996-4768-9a86-9812d3aaba6a\" (UID: \"80572eaf-e996-4768-9a86-9812d3aaba6a\") " Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.043454 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-utilities" (OuterVolumeSpecName: "utilities") pod "80572eaf-e996-4768-9a86-9812d3aaba6a" (UID: "80572eaf-e996-4768-9a86-9812d3aaba6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.050308 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80572eaf-e996-4768-9a86-9812d3aaba6a-kube-api-access-7vvlf" (OuterVolumeSpecName: "kube-api-access-7vvlf") pod "80572eaf-e996-4768-9a86-9812d3aaba6a" (UID: "80572eaf-e996-4768-9a86-9812d3aaba6a"). InnerVolumeSpecName "kube-api-access-7vvlf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.089499 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80572eaf-e996-4768-9a86-9812d3aaba6a" (UID: "80572eaf-e996-4768-9a86-9812d3aaba6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.145895 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.146160 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80572eaf-e996-4768-9a86-9812d3aaba6a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.146244 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vvlf\" (UniqueName: \"kubernetes.io/projected/80572eaf-e996-4768-9a86-9812d3aaba6a-kube-api-access-7vvlf\") on node \"crc\" DevicePath \"\"" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.511807 4790 generic.go:334] "Generic (PLEG): container finished" podID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerID="179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa" exitCode=0 Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.511850 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96rlt" event={"ID":"80572eaf-e996-4768-9a86-9812d3aaba6a","Type":"ContainerDied","Data":"179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa"} Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.511881 4790 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-96rlt" event={"ID":"80572eaf-e996-4768-9a86-9812d3aaba6a","Type":"ContainerDied","Data":"e00018367a781a51020de5ea1f718111b99435a2ea611aac868ebf5b59f98777"} Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.511896 4790 scope.go:117] "RemoveContainer" containerID="179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.511893 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-96rlt" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.543323 4790 scope.go:117] "RemoveContainer" containerID="9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.548070 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-96rlt"] Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.581756 4790 scope.go:117] "RemoveContainer" containerID="5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.582678 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-96rlt"] Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.638266 4790 scope.go:117] "RemoveContainer" containerID="179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa" Oct 03 10:07:39 crc kubenswrapper[4790]: E1003 10:07:39.638879 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa\": container with ID starting with 179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa not found: ID does not exist" containerID="179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 
10:07:39.638931 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa"} err="failed to get container status \"179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa\": rpc error: code = NotFound desc = could not find container \"179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa\": container with ID starting with 179fe4a4211672624a00dda3324b318887eb124b2175915e37043fa9f504b9fa not found: ID does not exist" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.638966 4790 scope.go:117] "RemoveContainer" containerID="9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527" Oct 03 10:07:39 crc kubenswrapper[4790]: E1003 10:07:39.639539 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527\": container with ID starting with 9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527 not found: ID does not exist" containerID="9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.639594 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527"} err="failed to get container status \"9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527\": rpc error: code = NotFound desc = could not find container \"9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527\": container with ID starting with 9bc40027066661ade9602a64dfacbaa131ad9ec4a47da3059cc43f1f73cd1527 not found: ID does not exist" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.639614 4790 scope.go:117] "RemoveContainer" containerID="5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435" Oct 03 10:07:39 crc 
kubenswrapper[4790]: E1003 10:07:39.640006 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435\": container with ID starting with 5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435 not found: ID does not exist" containerID="5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435" Oct 03 10:07:39 crc kubenswrapper[4790]: I1003 10:07:39.640046 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435"} err="failed to get container status \"5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435\": rpc error: code = NotFound desc = could not find container \"5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435\": container with ID starting with 5267e287d13033e8b4b10816b221ca52742c872bbfe06ada3ffed27112eca435 not found: ID does not exist" Oct 03 10:07:40 crc kubenswrapper[4790]: I1003 10:07:40.345717 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80572eaf-e996-4768-9a86-9812d3aaba6a" path="/var/lib/kubelet/pods/80572eaf-e996-4768-9a86-9812d3aaba6a/volumes" Oct 03 10:07:49 crc kubenswrapper[4790]: I1003 10:07:49.332135 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:07:49 crc kubenswrapper[4790]: E1003 10:07:49.333268 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:08:01 crc 
kubenswrapper[4790]: I1003 10:08:01.328547 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:08:01 crc kubenswrapper[4790]: E1003 10:08:01.329415 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:08:13 crc kubenswrapper[4790]: I1003 10:08:13.328198 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:08:14 crc kubenswrapper[4790]: I1003 10:08:14.847167 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"bc04f8a0c2ea3bee219e93e8f6cbd077be44f16f505b12226dc4f542aabae8e0"} Oct 03 10:10:36 crc kubenswrapper[4790]: I1003 10:10:36.910527 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gnt8r"] Oct 03 10:10:36 crc kubenswrapper[4790]: E1003 10:10:36.911949 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerName="extract-utilities" Oct 03 10:10:36 crc kubenswrapper[4790]: I1003 10:10:36.911965 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerName="extract-utilities" Oct 03 10:10:36 crc kubenswrapper[4790]: E1003 10:10:36.911974 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerName="registry-server" Oct 03 10:10:36 crc kubenswrapper[4790]: I1003 
10:10:36.911980 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerName="registry-server" Oct 03 10:10:36 crc kubenswrapper[4790]: E1003 10:10:36.912017 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerName="extract-content" Oct 03 10:10:36 crc kubenswrapper[4790]: I1003 10:10:36.912024 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerName="extract-content" Oct 03 10:10:36 crc kubenswrapper[4790]: I1003 10:10:36.912212 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="80572eaf-e996-4768-9a86-9812d3aaba6a" containerName="registry-server" Oct 03 10:10:36 crc kubenswrapper[4790]: I1003 10:10:36.914195 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:36 crc kubenswrapper[4790]: I1003 10:10:36.924766 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnt8r"] Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 10:10:37.068495 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-utilities\") pod \"community-operators-gnt8r\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 10:10:37.068627 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp8hb\" (UniqueName: \"kubernetes.io/projected/ff7d8b29-32ec-424e-abde-563eb6387b3e-kube-api-access-gp8hb\") pod \"community-operators-gnt8r\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 
10:10:37.068693 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-catalog-content\") pod \"community-operators-gnt8r\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 10:10:37.170853 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-utilities\") pod \"community-operators-gnt8r\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 10:10:37.170946 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp8hb\" (UniqueName: \"kubernetes.io/projected/ff7d8b29-32ec-424e-abde-563eb6387b3e-kube-api-access-gp8hb\") pod \"community-operators-gnt8r\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 10:10:37.171026 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-catalog-content\") pod \"community-operators-gnt8r\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 10:10:37.171495 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-utilities\") pod \"community-operators-gnt8r\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 10:10:37.171723 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-catalog-content\") pod \"community-operators-gnt8r\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 10:10:37.207686 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp8hb\" (UniqueName: \"kubernetes.io/projected/ff7d8b29-32ec-424e-abde-563eb6387b3e-kube-api-access-gp8hb\") pod \"community-operators-gnt8r\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 10:10:37.280984 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:37 crc kubenswrapper[4790]: I1003 10:10:37.849602 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnt8r"] Oct 03 10:10:38 crc kubenswrapper[4790]: I1003 10:10:38.269968 4790 generic.go:334] "Generic (PLEG): container finished" podID="ff7d8b29-32ec-424e-abde-563eb6387b3e" containerID="609338ef43dc96cb41a35910c32120547e0d7aa268b20f7a1c77d57a14f18598" exitCode=0 Oct 03 10:10:38 crc kubenswrapper[4790]: I1003 10:10:38.270018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnt8r" event={"ID":"ff7d8b29-32ec-424e-abde-563eb6387b3e","Type":"ContainerDied","Data":"609338ef43dc96cb41a35910c32120547e0d7aa268b20f7a1c77d57a14f18598"} Oct 03 10:10:38 crc kubenswrapper[4790]: I1003 10:10:38.270332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnt8r" event={"ID":"ff7d8b29-32ec-424e-abde-563eb6387b3e","Type":"ContainerStarted","Data":"2c3f46a94a6b93ee65e84333749b84acf3faf8f93583769652d762b4aa03bcdf"} Oct 
03 10:10:38 crc kubenswrapper[4790]: I1003 10:10:38.272476 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:10:39 crc kubenswrapper[4790]: I1003 10:10:39.290803 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnt8r" event={"ID":"ff7d8b29-32ec-424e-abde-563eb6387b3e","Type":"ContainerStarted","Data":"45d8484d80f316903499e5fdfd06711348110a9e8035a36cb6cf64534777d07c"} Oct 03 10:10:40 crc kubenswrapper[4790]: I1003 10:10:40.186128 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:10:40 crc kubenswrapper[4790]: I1003 10:10:40.186463 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:10:40 crc kubenswrapper[4790]: I1003 10:10:40.301382 4790 generic.go:334] "Generic (PLEG): container finished" podID="ff7d8b29-32ec-424e-abde-563eb6387b3e" containerID="45d8484d80f316903499e5fdfd06711348110a9e8035a36cb6cf64534777d07c" exitCode=0 Oct 03 10:10:40 crc kubenswrapper[4790]: I1003 10:10:40.301433 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnt8r" event={"ID":"ff7d8b29-32ec-424e-abde-563eb6387b3e","Type":"ContainerDied","Data":"45d8484d80f316903499e5fdfd06711348110a9e8035a36cb6cf64534777d07c"} Oct 03 10:10:41 crc kubenswrapper[4790]: I1003 10:10:41.313555 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnt8r" 
event={"ID":"ff7d8b29-32ec-424e-abde-563eb6387b3e","Type":"ContainerStarted","Data":"77d3725fe9dff771eee7783b3aa535bfc4ab29963ea1a0a5dfd93e75390b58c6"} Oct 03 10:10:41 crc kubenswrapper[4790]: I1003 10:10:41.333370 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gnt8r" podStartSLOduration=2.919127779 podStartE2EDuration="5.333349909s" podCreationTimestamp="2025-10-03 10:10:36 +0000 UTC" firstStartedPulling="2025-10-03 10:10:38.272066695 +0000 UTC m=+5424.625000943" lastFinishedPulling="2025-10-03 10:10:40.686288815 +0000 UTC m=+5427.039223073" observedRunningTime="2025-10-03 10:10:41.330738607 +0000 UTC m=+5427.683672855" watchObservedRunningTime="2025-10-03 10:10:41.333349909 +0000 UTC m=+5427.686284147" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.292475 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-22plk"] Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.295685 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.320234 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-22plk"] Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.421714 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctp5z\" (UniqueName: \"kubernetes.io/projected/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-kube-api-access-ctp5z\") pod \"redhat-operators-22plk\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.422371 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-catalog-content\") pod \"redhat-operators-22plk\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.422441 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-utilities\") pod \"redhat-operators-22plk\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.525040 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctp5z\" (UniqueName: \"kubernetes.io/projected/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-kube-api-access-ctp5z\") pod \"redhat-operators-22plk\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.525243 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-catalog-content\") pod \"redhat-operators-22plk\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.525271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-utilities\") pod \"redhat-operators-22plk\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.525906 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-catalog-content\") pod \"redhat-operators-22plk\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.526002 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-utilities\") pod \"redhat-operators-22plk\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.550811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctp5z\" (UniqueName: \"kubernetes.io/projected/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-kube-api-access-ctp5z\") pod \"redhat-operators-22plk\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:44 crc kubenswrapper[4790]: I1003 10:10:44.626005 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:45 crc kubenswrapper[4790]: I1003 10:10:45.140013 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-22plk"] Oct 03 10:10:45 crc kubenswrapper[4790]: W1003 10:10:45.143715 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ebc880d_93ac_4e27_aa9b_e553df6bcbd8.slice/crio-a9023ccf03c82f127b117dfd2971697f476119e30217636b4da3b991df5f3842 WatchSource:0}: Error finding container a9023ccf03c82f127b117dfd2971697f476119e30217636b4da3b991df5f3842: Status 404 returned error can't find the container with id a9023ccf03c82f127b117dfd2971697f476119e30217636b4da3b991df5f3842 Oct 03 10:10:45 crc kubenswrapper[4790]: I1003 10:10:45.355114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22plk" event={"ID":"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8","Type":"ContainerStarted","Data":"4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080"} Oct 03 10:10:45 crc kubenswrapper[4790]: I1003 10:10:45.355294 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22plk" event={"ID":"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8","Type":"ContainerStarted","Data":"a9023ccf03c82f127b117dfd2971697f476119e30217636b4da3b991df5f3842"} Oct 03 10:10:46 crc kubenswrapper[4790]: I1003 10:10:46.381426 4790 generic.go:334] "Generic (PLEG): container finished" podID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" containerID="4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080" exitCode=0 Oct 03 10:10:46 crc kubenswrapper[4790]: I1003 10:10:46.381927 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22plk" 
event={"ID":"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8","Type":"ContainerDied","Data":"4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080"} Oct 03 10:10:47 crc kubenswrapper[4790]: I1003 10:10:47.281044 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:47 crc kubenswrapper[4790]: I1003 10:10:47.281101 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:47 crc kubenswrapper[4790]: I1003 10:10:47.330960 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:47 crc kubenswrapper[4790]: I1003 10:10:47.448920 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:48 crc kubenswrapper[4790]: I1003 10:10:48.404155 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22plk" event={"ID":"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8","Type":"ContainerStarted","Data":"69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1"} Oct 03 10:10:49 crc kubenswrapper[4790]: I1003 10:10:49.685106 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gnt8r"] Oct 03 10:10:49 crc kubenswrapper[4790]: I1003 10:10:49.687087 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gnt8r" podUID="ff7d8b29-32ec-424e-abde-563eb6387b3e" containerName="registry-server" containerID="cri-o://77d3725fe9dff771eee7783b3aa535bfc4ab29963ea1a0a5dfd93e75390b58c6" gracePeriod=2 Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.426543 4790 generic.go:334] "Generic (PLEG): container finished" podID="ff7d8b29-32ec-424e-abde-563eb6387b3e" 
containerID="77d3725fe9dff771eee7783b3aa535bfc4ab29963ea1a0a5dfd93e75390b58c6" exitCode=0 Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.426728 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnt8r" event={"ID":"ff7d8b29-32ec-424e-abde-563eb6387b3e","Type":"ContainerDied","Data":"77d3725fe9dff771eee7783b3aa535bfc4ab29963ea1a0a5dfd93e75390b58c6"} Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.639111 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.777077 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-catalog-content\") pod \"ff7d8b29-32ec-424e-abde-563eb6387b3e\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.777153 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-utilities\") pod \"ff7d8b29-32ec-424e-abde-563eb6387b3e\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.777257 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp8hb\" (UniqueName: \"kubernetes.io/projected/ff7d8b29-32ec-424e-abde-563eb6387b3e-kube-api-access-gp8hb\") pod \"ff7d8b29-32ec-424e-abde-563eb6387b3e\" (UID: \"ff7d8b29-32ec-424e-abde-563eb6387b3e\") " Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.778874 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-utilities" (OuterVolumeSpecName: "utilities") pod "ff7d8b29-32ec-424e-abde-563eb6387b3e" (UID: 
"ff7d8b29-32ec-424e-abde-563eb6387b3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.791783 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff7d8b29-32ec-424e-abde-563eb6387b3e-kube-api-access-gp8hb" (OuterVolumeSpecName: "kube-api-access-gp8hb") pod "ff7d8b29-32ec-424e-abde-563eb6387b3e" (UID: "ff7d8b29-32ec-424e-abde-563eb6387b3e"). InnerVolumeSpecName "kube-api-access-gp8hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.835386 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff7d8b29-32ec-424e-abde-563eb6387b3e" (UID: "ff7d8b29-32ec-424e-abde-563eb6387b3e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.880291 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.880337 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff7d8b29-32ec-424e-abde-563eb6387b3e-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:10:50 crc kubenswrapper[4790]: I1003 10:10:50.880351 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp8hb\" (UniqueName: \"kubernetes.io/projected/ff7d8b29-32ec-424e-abde-563eb6387b3e-kube-api-access-gp8hb\") on node \"crc\" DevicePath \"\"" Oct 03 10:10:51 crc kubenswrapper[4790]: I1003 10:10:51.441325 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" containerID="69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1" exitCode=0 Oct 03 10:10:51 crc kubenswrapper[4790]: I1003 10:10:51.441419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22plk" event={"ID":"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8","Type":"ContainerDied","Data":"69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1"} Oct 03 10:10:51 crc kubenswrapper[4790]: I1003 10:10:51.444875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnt8r" event={"ID":"ff7d8b29-32ec-424e-abde-563eb6387b3e","Type":"ContainerDied","Data":"2c3f46a94a6b93ee65e84333749b84acf3faf8f93583769652d762b4aa03bcdf"} Oct 03 10:10:51 crc kubenswrapper[4790]: I1003 10:10:51.444926 4790 scope.go:117] "RemoveContainer" containerID="77d3725fe9dff771eee7783b3aa535bfc4ab29963ea1a0a5dfd93e75390b58c6" Oct 03 10:10:51 crc kubenswrapper[4790]: I1003 10:10:51.444935 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnt8r" Oct 03 10:10:51 crc kubenswrapper[4790]: I1003 10:10:51.465562 4790 scope.go:117] "RemoveContainer" containerID="45d8484d80f316903499e5fdfd06711348110a9e8035a36cb6cf64534777d07c" Oct 03 10:10:51 crc kubenswrapper[4790]: I1003 10:10:51.494519 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gnt8r"] Oct 03 10:10:51 crc kubenswrapper[4790]: I1003 10:10:51.503053 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gnt8r"] Oct 03 10:10:51 crc kubenswrapper[4790]: I1003 10:10:51.509980 4790 scope.go:117] "RemoveContainer" containerID="609338ef43dc96cb41a35910c32120547e0d7aa268b20f7a1c77d57a14f18598" Oct 03 10:10:52 crc kubenswrapper[4790]: I1003 10:10:52.343437 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff7d8b29-32ec-424e-abde-563eb6387b3e" path="/var/lib/kubelet/pods/ff7d8b29-32ec-424e-abde-563eb6387b3e/volumes" Oct 03 10:10:52 crc kubenswrapper[4790]: I1003 10:10:52.457944 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22plk" event={"ID":"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8","Type":"ContainerStarted","Data":"20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e"} Oct 03 10:10:52 crc kubenswrapper[4790]: I1003 10:10:52.484224 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-22plk" podStartSLOduration=2.802858048 podStartE2EDuration="8.484201085s" podCreationTimestamp="2025-10-03 10:10:44 +0000 UTC" firstStartedPulling="2025-10-03 10:10:46.385655749 +0000 UTC m=+5432.738589987" lastFinishedPulling="2025-10-03 10:10:52.066998786 +0000 UTC m=+5438.419933024" observedRunningTime="2025-10-03 10:10:52.478394215 +0000 UTC m=+5438.831328473" watchObservedRunningTime="2025-10-03 10:10:52.484201085 +0000 UTC m=+5438.837135323" Oct 03 
10:10:54 crc kubenswrapper[4790]: I1003 10:10:54.629155 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:54 crc kubenswrapper[4790]: I1003 10:10:54.629732 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:10:55 crc kubenswrapper[4790]: I1003 10:10:55.677170 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-22plk" podUID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" containerName="registry-server" probeResult="failure" output=< Oct 03 10:10:55 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 10:10:55 crc kubenswrapper[4790]: > Oct 03 10:11:04 crc kubenswrapper[4790]: I1003 10:11:04.699789 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:11:04 crc kubenswrapper[4790]: I1003 10:11:04.750766 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:11:04 crc kubenswrapper[4790]: I1003 10:11:04.942484 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-22plk"] Oct 03 10:11:06 crc kubenswrapper[4790]: I1003 10:11:06.603726 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-22plk" podUID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" containerName="registry-server" containerID="cri-o://20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e" gracePeriod=2 Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.079212 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.224452 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-utilities\") pod \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.224719 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctp5z\" (UniqueName: \"kubernetes.io/projected/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-kube-api-access-ctp5z\") pod \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.224804 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-catalog-content\") pod \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\" (UID: \"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8\") " Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.225510 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-utilities" (OuterVolumeSpecName: "utilities") pod "1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" (UID: "1ebc880d-93ac-4e27-aa9b-e553df6bcbd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.237242 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-kube-api-access-ctp5z" (OuterVolumeSpecName: "kube-api-access-ctp5z") pod "1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" (UID: "1ebc880d-93ac-4e27-aa9b-e553df6bcbd8"). InnerVolumeSpecName "kube-api-access-ctp5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.327502 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctp5z\" (UniqueName: \"kubernetes.io/projected/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-kube-api-access-ctp5z\") on node \"crc\" DevicePath \"\"" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.327751 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.348185 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" (UID: "1ebc880d-93ac-4e27-aa9b-e553df6bcbd8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.431540 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.619017 4790 generic.go:334] "Generic (PLEG): container finished" podID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" containerID="20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e" exitCode=0 Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.619067 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22plk" event={"ID":"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8","Type":"ContainerDied","Data":"20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e"} Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.619098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-22plk" event={"ID":"1ebc880d-93ac-4e27-aa9b-e553df6bcbd8","Type":"ContainerDied","Data":"a9023ccf03c82f127b117dfd2971697f476119e30217636b4da3b991df5f3842"} Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.619105 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-22plk" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.619116 4790 scope.go:117] "RemoveContainer" containerID="20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.655759 4790 scope.go:117] "RemoveContainer" containerID="69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.672740 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-22plk"] Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.725551 4790 scope.go:117] "RemoveContainer" containerID="4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.728144 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-22plk"] Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.760385 4790 scope.go:117] "RemoveContainer" containerID="20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e" Oct 03 10:11:07 crc kubenswrapper[4790]: E1003 10:11:07.761085 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e\": container with ID starting with 20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e not found: ID does not exist" containerID="20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.761153 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e"} err="failed to get container status \"20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e\": rpc error: code = NotFound desc = could not find container \"20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e\": container with ID starting with 20c99bfe3cee250739b34cb2db95394ed86ea8ad996503dadd8956891a591f7e not found: ID does not exist" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.761186 4790 scope.go:117] "RemoveContainer" containerID="69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1" Oct 03 10:11:07 crc kubenswrapper[4790]: E1003 10:11:07.761700 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1\": container with ID starting with 69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1 not found: ID does not exist" containerID="69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.761750 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1"} err="failed to get container status \"69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1\": rpc error: code = NotFound desc = could not find container \"69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1\": container with ID starting with 69a3bb1d79f7c0208575214bdaab2f1cc453ea3297243dd2c14f68ad3bcfe9b1 not found: ID does not exist" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.761776 4790 scope.go:117] "RemoveContainer" containerID="4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080" Oct 03 10:11:07 crc kubenswrapper[4790]: E1003 
10:11:07.762111 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080\": container with ID starting with 4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080 not found: ID does not exist" containerID="4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080" Oct 03 10:11:07 crc kubenswrapper[4790]: I1003 10:11:07.762136 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080"} err="failed to get container status \"4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080\": rpc error: code = NotFound desc = could not find container \"4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080\": container with ID starting with 4e78fac0a1bb892bb9bef15c159892ac374ed0476bf115d2227bfb081caa8080 not found: ID does not exist" Oct 03 10:11:08 crc kubenswrapper[4790]: I1003 10:11:08.339493 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" path="/var/lib/kubelet/pods/1ebc880d-93ac-4e27-aa9b-e553df6bcbd8/volumes" Oct 03 10:11:10 crc kubenswrapper[4790]: I1003 10:11:10.186163 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:11:10 crc kubenswrapper[4790]: I1003 10:11:10.186525 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 03 10:11:30 crc kubenswrapper[4790]: I1003 10:11:30.743823 4790 scope.go:117] "RemoveContainer" containerID="b80913a8c5759b12224525de539cf69367e79164219ce7b97a6d56fb26c6be6e" Oct 03 10:11:30 crc kubenswrapper[4790]: I1003 10:11:30.773974 4790 scope.go:117] "RemoveContainer" containerID="6f07fed0dd7279f584ebd88515ea6b0f8ec8f0a97906c04c0f817573108590a2" Oct 03 10:11:30 crc kubenswrapper[4790]: I1003 10:11:30.830919 4790 scope.go:117] "RemoveContainer" containerID="ca3f46924a8e0f5c549136ae87bcb68b39e12d0273a6af0da393d15db6b316d4" Oct 03 10:11:40 crc kubenswrapper[4790]: I1003 10:11:40.186564 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:11:40 crc kubenswrapper[4790]: I1003 10:11:40.187246 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:11:40 crc kubenswrapper[4790]: I1003 10:11:40.187304 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 10:11:40 crc kubenswrapper[4790]: I1003 10:11:40.188222 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bc04f8a0c2ea3bee219e93e8f6cbd077be44f16f505b12226dc4f542aabae8e0"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:11:40 crc kubenswrapper[4790]: I1003 10:11:40.188283 
4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://bc04f8a0c2ea3bee219e93e8f6cbd077be44f16f505b12226dc4f542aabae8e0" gracePeriod=600 Oct 03 10:11:40 crc kubenswrapper[4790]: I1003 10:11:40.937782 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="bc04f8a0c2ea3bee219e93e8f6cbd077be44f16f505b12226dc4f542aabae8e0" exitCode=0 Oct 03 10:11:40 crc kubenswrapper[4790]: I1003 10:11:40.937899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"bc04f8a0c2ea3bee219e93e8f6cbd077be44f16f505b12226dc4f542aabae8e0"} Oct 03 10:11:40 crc kubenswrapper[4790]: I1003 10:11:40.938422 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037"} Oct 03 10:11:40 crc kubenswrapper[4790]: I1003 10:11:40.938448 4790 scope.go:117] "RemoveContainer" containerID="1a64f73fcffe7eaf400727d35dc4b02f6a92ebdf24e33ac845589e5623ad52ac" Oct 03 10:13:40 crc kubenswrapper[4790]: I1003 10:13:40.185922 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:13:40 crc kubenswrapper[4790]: I1003 10:13:40.186555 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" 
podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:14:10 crc kubenswrapper[4790]: I1003 10:14:10.186304 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:14:10 crc kubenswrapper[4790]: I1003 10:14:10.187008 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:14:40 crc kubenswrapper[4790]: I1003 10:14:40.186741 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:14:40 crc kubenswrapper[4790]: I1003 10:14:40.187418 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:14:40 crc kubenswrapper[4790]: I1003 10:14:40.187473 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 10:14:40 crc kubenswrapper[4790]: I1003 10:14:40.188311 4790 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:14:40 crc kubenswrapper[4790]: I1003 10:14:40.188382 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" gracePeriod=600 Oct 03 10:14:40 crc kubenswrapper[4790]: E1003 10:14:40.323349 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:14:40 crc kubenswrapper[4790]: I1003 10:14:40.747442 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" exitCode=0 Oct 03 10:14:40 crc kubenswrapper[4790]: I1003 10:14:40.747475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037"} Oct 03 10:14:40 crc kubenswrapper[4790]: I1003 10:14:40.747918 4790 scope.go:117] "RemoveContainer" 
containerID="bc04f8a0c2ea3bee219e93e8f6cbd077be44f16f505b12226dc4f542aabae8e0" Oct 03 10:14:40 crc kubenswrapper[4790]: I1003 10:14:40.748622 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:14:40 crc kubenswrapper[4790]: E1003 10:14:40.749024 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:14:51 crc kubenswrapper[4790]: I1003 10:14:51.329438 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:14:51 crc kubenswrapper[4790]: E1003 10:14:51.330333 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.151628 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p"] Oct 03 10:15:00 crc kubenswrapper[4790]: E1003 10:15:00.152732 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" containerName="extract-content" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.152749 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" 
containerName="extract-content" Oct 03 10:15:00 crc kubenswrapper[4790]: E1003 10:15:00.152788 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7d8b29-32ec-424e-abde-563eb6387b3e" containerName="extract-utilities" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.152794 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7d8b29-32ec-424e-abde-563eb6387b3e" containerName="extract-utilities" Oct 03 10:15:00 crc kubenswrapper[4790]: E1003 10:15:00.152805 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7d8b29-32ec-424e-abde-563eb6387b3e" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.152813 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7d8b29-32ec-424e-abde-563eb6387b3e" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4790]: E1003 10:15:00.152824 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" containerName="extract-utilities" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.152830 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" containerName="extract-utilities" Oct 03 10:15:00 crc kubenswrapper[4790]: E1003 10:15:00.152844 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff7d8b29-32ec-424e-abde-563eb6387b3e" containerName="extract-content" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.152850 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff7d8b29-32ec-424e-abde-563eb6387b3e" containerName="extract-content" Oct 03 10:15:00 crc kubenswrapper[4790]: E1003 10:15:00.152860 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.152865 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" 
containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.153061 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ebc880d-93ac-4e27-aa9b-e553df6bcbd8" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.153070 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff7d8b29-32ec-424e-abde-563eb6387b3e" containerName="registry-server" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.153981 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.157036 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.157752 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.162056 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p"] Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.336461 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41e34a0c-976d-45e6-9f15-c0c9cb6878df-secret-volume\") pod \"collect-profiles-29324775-vj96p\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.336590 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41e34a0c-976d-45e6-9f15-c0c9cb6878df-config-volume\") pod \"collect-profiles-29324775-vj96p\" 
(UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.336792 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zmkl\" (UniqueName: \"kubernetes.io/projected/41e34a0c-976d-45e6-9f15-c0c9cb6878df-kube-api-access-7zmkl\") pod \"collect-profiles-29324775-vj96p\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.438544 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41e34a0c-976d-45e6-9f15-c0c9cb6878df-config-volume\") pod \"collect-profiles-29324775-vj96p\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.439119 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zmkl\" (UniqueName: \"kubernetes.io/projected/41e34a0c-976d-45e6-9f15-c0c9cb6878df-kube-api-access-7zmkl\") pod \"collect-profiles-29324775-vj96p\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.439190 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41e34a0c-976d-45e6-9f15-c0c9cb6878df-secret-volume\") pod \"collect-profiles-29324775-vj96p\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.440800 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/41e34a0c-976d-45e6-9f15-c0c9cb6878df-config-volume\") pod \"collect-profiles-29324775-vj96p\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.454923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41e34a0c-976d-45e6-9f15-c0c9cb6878df-secret-volume\") pod \"collect-profiles-29324775-vj96p\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.459625 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zmkl\" (UniqueName: \"kubernetes.io/projected/41e34a0c-976d-45e6-9f15-c0c9cb6878df-kube-api-access-7zmkl\") pod \"collect-profiles-29324775-vj96p\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:00 crc kubenswrapper[4790]: I1003 10:15:00.559950 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:01 crc kubenswrapper[4790]: I1003 10:15:01.094102 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p"] Oct 03 10:15:01 crc kubenswrapper[4790]: I1003 10:15:01.976538 4790 generic.go:334] "Generic (PLEG): container finished" podID="41e34a0c-976d-45e6-9f15-c0c9cb6878df" containerID="eb1be1fa2bab10d998200f9de508849361b5aa1db6fbf3e31115f4b9b0119b2d" exitCode=0 Oct 03 10:15:01 crc kubenswrapper[4790]: I1003 10:15:01.976871 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" event={"ID":"41e34a0c-976d-45e6-9f15-c0c9cb6878df","Type":"ContainerDied","Data":"eb1be1fa2bab10d998200f9de508849361b5aa1db6fbf3e31115f4b9b0119b2d"} Oct 03 10:15:01 crc kubenswrapper[4790]: I1003 10:15:01.976905 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" event={"ID":"41e34a0c-976d-45e6-9f15-c0c9cb6878df","Type":"ContainerStarted","Data":"0dd4d04a451dd19415603691a92cf35c6e11878719e79b51af79a9bab1d654e9"} Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.359044 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.511403 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41e34a0c-976d-45e6-9f15-c0c9cb6878df-secret-volume\") pod \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.511799 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zmkl\" (UniqueName: \"kubernetes.io/projected/41e34a0c-976d-45e6-9f15-c0c9cb6878df-kube-api-access-7zmkl\") pod \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.511841 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41e34a0c-976d-45e6-9f15-c0c9cb6878df-config-volume\") pod \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\" (UID: \"41e34a0c-976d-45e6-9f15-c0c9cb6878df\") " Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.512896 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e34a0c-976d-45e6-9f15-c0c9cb6878df-config-volume" (OuterVolumeSpecName: "config-volume") pod "41e34a0c-976d-45e6-9f15-c0c9cb6878df" (UID: "41e34a0c-976d-45e6-9f15-c0c9cb6878df"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.521753 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e34a0c-976d-45e6-9f15-c0c9cb6878df-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "41e34a0c-976d-45e6-9f15-c0c9cb6878df" (UID: "41e34a0c-976d-45e6-9f15-c0c9cb6878df"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.534071 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e34a0c-976d-45e6-9f15-c0c9cb6878df-kube-api-access-7zmkl" (OuterVolumeSpecName: "kube-api-access-7zmkl") pod "41e34a0c-976d-45e6-9f15-c0c9cb6878df" (UID: "41e34a0c-976d-45e6-9f15-c0c9cb6878df"). InnerVolumeSpecName "kube-api-access-7zmkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.625096 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41e34a0c-976d-45e6-9f15-c0c9cb6878df-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.625150 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zmkl\" (UniqueName: \"kubernetes.io/projected/41e34a0c-976d-45e6-9f15-c0c9cb6878df-kube-api-access-7zmkl\") on node \"crc\" DevicePath \"\"" Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.625166 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41e34a0c-976d-45e6-9f15-c0c9cb6878df-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.999188 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" event={"ID":"41e34a0c-976d-45e6-9f15-c0c9cb6878df","Type":"ContainerDied","Data":"0dd4d04a451dd19415603691a92cf35c6e11878719e79b51af79a9bab1d654e9"} Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.999247 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dd4d04a451dd19415603691a92cf35c6e11878719e79b51af79a9bab1d654e9" Oct 03 10:15:03 crc kubenswrapper[4790]: I1003 10:15:03.999251 4790 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324775-vj96p" Oct 03 10:15:04 crc kubenswrapper[4790]: I1003 10:15:04.328113 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:15:04 crc kubenswrapper[4790]: E1003 10:15:04.328806 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:15:04 crc kubenswrapper[4790]: I1003 10:15:04.438297 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh"] Oct 03 10:15:04 crc kubenswrapper[4790]: I1003 10:15:04.447117 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324730-q4thh"] Oct 03 10:15:06 crc kubenswrapper[4790]: I1003 10:15:06.338965 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72133265-7f2f-4394-b2a6-79ea86c4d8d2" path="/var/lib/kubelet/pods/72133265-7f2f-4394-b2a6-79ea86c4d8d2/volumes" Oct 03 10:15:15 crc kubenswrapper[4790]: I1003 10:15:15.328869 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:15:15 crc kubenswrapper[4790]: E1003 10:15:15.329823 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:15:29 crc kubenswrapper[4790]: I1003 10:15:29.328619 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:15:29 crc kubenswrapper[4790]: E1003 10:15:29.329475 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:15:31 crc kubenswrapper[4790]: I1003 10:15:31.012639 4790 scope.go:117] "RemoveContainer" containerID="bb1ed60ac67850026255fc988a3f9d1023630b7dfd4e875c77b9a3235bb0fbd5" Oct 03 10:15:41 crc kubenswrapper[4790]: I1003 10:15:41.327973 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:15:41 crc kubenswrapper[4790]: E1003 10:15:41.328896 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.295437 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-75585"] Oct 03 10:15:42 crc kubenswrapper[4790]: E1003 10:15:42.296300 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e34a0c-976d-45e6-9f15-c0c9cb6878df" 
containerName="collect-profiles" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.296319 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e34a0c-976d-45e6-9f15-c0c9cb6878df" containerName="collect-profiles" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.296539 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e34a0c-976d-45e6-9f15-c0c9cb6878df" containerName="collect-profiles" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.298049 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.317418 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-75585"] Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.414893 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74xx6\" (UniqueName: \"kubernetes.io/projected/92eb12f6-b080-40b7-a31e-73d4779f5ea9-kube-api-access-74xx6\") pod \"redhat-marketplace-75585\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.414956 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-catalog-content\") pod \"redhat-marketplace-75585\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.415591 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-utilities\") pod \"redhat-marketplace-75585\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " 
pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.518147 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74xx6\" (UniqueName: \"kubernetes.io/projected/92eb12f6-b080-40b7-a31e-73d4779f5ea9-kube-api-access-74xx6\") pod \"redhat-marketplace-75585\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.518223 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-catalog-content\") pod \"redhat-marketplace-75585\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.518395 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-utilities\") pod \"redhat-marketplace-75585\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.518862 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-catalog-content\") pod \"redhat-marketplace-75585\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.518917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-utilities\") pod \"redhat-marketplace-75585\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " pod="openshift-marketplace/redhat-marketplace-75585" 
Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.548907 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74xx6\" (UniqueName: \"kubernetes.io/projected/92eb12f6-b080-40b7-a31e-73d4779f5ea9-kube-api-access-74xx6\") pod \"redhat-marketplace-75585\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:42 crc kubenswrapper[4790]: I1003 10:15:42.632295 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:43 crc kubenswrapper[4790]: I1003 10:15:43.119662 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-75585"] Oct 03 10:15:43 crc kubenswrapper[4790]: I1003 10:15:43.364449 4790 generic.go:334] "Generic (PLEG): container finished" podID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerID="c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1" exitCode=0 Oct 03 10:15:43 crc kubenswrapper[4790]: I1003 10:15:43.364526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75585" event={"ID":"92eb12f6-b080-40b7-a31e-73d4779f5ea9","Type":"ContainerDied","Data":"c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1"} Oct 03 10:15:43 crc kubenswrapper[4790]: I1003 10:15:43.364571 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75585" event={"ID":"92eb12f6-b080-40b7-a31e-73d4779f5ea9","Type":"ContainerStarted","Data":"c21647e2b5c3167ad0f581662094702555ce8885bd9ba750c89658cdec728bd4"} Oct 03 10:15:43 crc kubenswrapper[4790]: I1003 10:15:43.367141 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:15:44 crc kubenswrapper[4790]: I1003 10:15:44.375621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75585" 
event={"ID":"92eb12f6-b080-40b7-a31e-73d4779f5ea9","Type":"ContainerStarted","Data":"30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5"} Oct 03 10:15:45 crc kubenswrapper[4790]: I1003 10:15:45.395984 4790 generic.go:334] "Generic (PLEG): container finished" podID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerID="30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5" exitCode=0 Oct 03 10:15:45 crc kubenswrapper[4790]: I1003 10:15:45.396090 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75585" event={"ID":"92eb12f6-b080-40b7-a31e-73d4779f5ea9","Type":"ContainerDied","Data":"30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5"} Oct 03 10:15:46 crc kubenswrapper[4790]: I1003 10:15:46.408411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75585" event={"ID":"92eb12f6-b080-40b7-a31e-73d4779f5ea9","Type":"ContainerStarted","Data":"5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021"} Oct 03 10:15:46 crc kubenswrapper[4790]: I1003 10:15:46.428446 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-75585" podStartSLOduration=1.9821128460000002 podStartE2EDuration="4.428429522s" podCreationTimestamp="2025-10-03 10:15:42 +0000 UTC" firstStartedPulling="2025-10-03 10:15:43.366805348 +0000 UTC m=+5729.719739586" lastFinishedPulling="2025-10-03 10:15:45.813122024 +0000 UTC m=+5732.166056262" observedRunningTime="2025-10-03 10:15:46.424384711 +0000 UTC m=+5732.777318949" watchObservedRunningTime="2025-10-03 10:15:46.428429522 +0000 UTC m=+5732.781363760" Oct 03 10:15:52 crc kubenswrapper[4790]: I1003 10:15:52.632644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:52 crc kubenswrapper[4790]: I1003 10:15:52.633383 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:52 crc kubenswrapper[4790]: I1003 10:15:52.686238 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:53 crc kubenswrapper[4790]: I1003 10:15:53.526099 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:53 crc kubenswrapper[4790]: I1003 10:15:53.571936 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-75585"] Oct 03 10:15:54 crc kubenswrapper[4790]: I1003 10:15:54.334354 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:15:54 crc kubenswrapper[4790]: E1003 10:15:54.334692 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:15:55 crc kubenswrapper[4790]: I1003 10:15:55.489028 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-75585" podUID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerName="registry-server" containerID="cri-o://5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021" gracePeriod=2 Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.014513 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.115600 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-utilities\") pod \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.115879 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74xx6\" (UniqueName: \"kubernetes.io/projected/92eb12f6-b080-40b7-a31e-73d4779f5ea9-kube-api-access-74xx6\") pod \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.115998 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-catalog-content\") pod \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\" (UID: \"92eb12f6-b080-40b7-a31e-73d4779f5ea9\") " Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.116634 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-utilities" (OuterVolumeSpecName: "utilities") pod "92eb12f6-b080-40b7-a31e-73d4779f5ea9" (UID: "92eb12f6-b080-40b7-a31e-73d4779f5ea9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.124167 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92eb12f6-b080-40b7-a31e-73d4779f5ea9-kube-api-access-74xx6" (OuterVolumeSpecName: "kube-api-access-74xx6") pod "92eb12f6-b080-40b7-a31e-73d4779f5ea9" (UID: "92eb12f6-b080-40b7-a31e-73d4779f5ea9"). InnerVolumeSpecName "kube-api-access-74xx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.133741 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74xx6\" (UniqueName: \"kubernetes.io/projected/92eb12f6-b080-40b7-a31e-73d4779f5ea9-kube-api-access-74xx6\") on node \"crc\" DevicePath \"\"" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.133801 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.136756 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92eb12f6-b080-40b7-a31e-73d4779f5ea9" (UID: "92eb12f6-b080-40b7-a31e-73d4779f5ea9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.236306 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92eb12f6-b080-40b7-a31e-73d4779f5ea9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.500818 4790 generic.go:334] "Generic (PLEG): container finished" podID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerID="5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021" exitCode=0 Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.500883 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-75585" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.500881 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75585" event={"ID":"92eb12f6-b080-40b7-a31e-73d4779f5ea9","Type":"ContainerDied","Data":"5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021"} Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.501248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-75585" event={"ID":"92eb12f6-b080-40b7-a31e-73d4779f5ea9","Type":"ContainerDied","Data":"c21647e2b5c3167ad0f581662094702555ce8885bd9ba750c89658cdec728bd4"} Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.501272 4790 scope.go:117] "RemoveContainer" containerID="5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.529822 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-75585"] Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.537996 4790 scope.go:117] "RemoveContainer" containerID="30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.539491 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-75585"] Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.562726 4790 scope.go:117] "RemoveContainer" containerID="c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.613206 4790 scope.go:117] "RemoveContainer" containerID="5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021" Oct 03 10:15:56 crc kubenswrapper[4790]: E1003 10:15:56.613902 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021\": container with ID starting with 5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021 not found: ID does not exist" containerID="5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.614066 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021"} err="failed to get container status \"5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021\": rpc error: code = NotFound desc = could not find container \"5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021\": container with ID starting with 5413a93d359c2dc111f40915e9f0c292433d1ee9685594deb4849deceb8e0021 not found: ID does not exist" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.614184 4790 scope.go:117] "RemoveContainer" containerID="30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5" Oct 03 10:15:56 crc kubenswrapper[4790]: E1003 10:15:56.614517 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5\": container with ID starting with 30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5 not found: ID does not exist" containerID="30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.614546 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5"} err="failed to get container status \"30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5\": rpc error: code = NotFound desc = could not find container \"30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5\": container with ID 
starting with 30d9dae419a4eb5e812f95ac183b152eddbbff86077d2af1b35c7bd6473aa1a5 not found: ID does not exist" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.614569 4790 scope.go:117] "RemoveContainer" containerID="c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1" Oct 03 10:15:56 crc kubenswrapper[4790]: E1003 10:15:56.614920 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1\": container with ID starting with c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1 not found: ID does not exist" containerID="c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1" Oct 03 10:15:56 crc kubenswrapper[4790]: I1003 10:15:56.615020 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1"} err="failed to get container status \"c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1\": rpc error: code = NotFound desc = could not find container \"c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1\": container with ID starting with c7a57e7fc4717e7758ec27e54d6a42d0588494431ceeccf165cc24e06ae581f1 not found: ID does not exist" Oct 03 10:15:58 crc kubenswrapper[4790]: I1003 10:15:58.339767 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" path="/var/lib/kubelet/pods/92eb12f6-b080-40b7-a31e-73d4779f5ea9/volumes" Oct 03 10:16:06 crc kubenswrapper[4790]: I1003 10:16:06.328857 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:16:06 crc kubenswrapper[4790]: E1003 10:16:06.329611 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:16:21 crc kubenswrapper[4790]: I1003 10:16:21.328464 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:16:21 crc kubenswrapper[4790]: E1003 10:16:21.329258 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:16:34 crc kubenswrapper[4790]: I1003 10:16:34.335906 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:16:34 crc kubenswrapper[4790]: E1003 10:16:34.336635 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:16:47 crc kubenswrapper[4790]: I1003 10:16:47.328541 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:16:47 crc kubenswrapper[4790]: E1003 10:16:47.329285 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:16:58 crc kubenswrapper[4790]: I1003 10:16:58.328243 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:16:58 crc kubenswrapper[4790]: E1003 10:16:58.328942 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:17:12 crc kubenswrapper[4790]: I1003 10:17:12.329363 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:17:12 crc kubenswrapper[4790]: E1003 10:17:12.330438 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:17:24 crc kubenswrapper[4790]: I1003 10:17:24.338229 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:17:24 crc kubenswrapper[4790]: E1003 10:17:24.339171 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:17:35 crc kubenswrapper[4790]: I1003 10:17:35.328559 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:17:35 crc kubenswrapper[4790]: E1003 10:17:35.329274 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:17:37 crc kubenswrapper[4790]: I1003 10:17:37.465329 4790 generic.go:334] "Generic (PLEG): container finished" podID="69ff309d-d784-487d-a190-4529008cb497" containerID="2e5feb2eaec978d38327aa47155c6f3b66adf79d0d1f1bc741f20249ab2c1c2b" exitCode=0 Oct 03 10:17:37 crc kubenswrapper[4790]: I1003 10:17:37.465411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"69ff309d-d784-487d-a190-4529008cb497","Type":"ContainerDied","Data":"2e5feb2eaec978d38327aa47155c6f3b66adf79d0d1f1bc741f20249ab2c1c2b"} Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.833425 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.977693 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-temporary\") pod \"69ff309d-d784-487d-a190-4529008cb497\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.977838 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-config-data\") pod \"69ff309d-d784-487d-a190-4529008cb497\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.977889 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-openstack-config-secret\") pod \"69ff309d-d784-487d-a190-4529008cb497\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.977939 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-workdir\") pod \"69ff309d-d784-487d-a190-4529008cb497\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.977997 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ca-certs\") pod \"69ff309d-d784-487d-a190-4529008cb497\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.978024 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sm8pg\" (UniqueName: \"kubernetes.io/projected/69ff309d-d784-487d-a190-4529008cb497-kube-api-access-sm8pg\") pod \"69ff309d-d784-487d-a190-4529008cb497\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.978079 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ssh-key\") pod \"69ff309d-d784-487d-a190-4529008cb497\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.978113 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"69ff309d-d784-487d-a190-4529008cb497\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.978239 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-openstack-config\") pod \"69ff309d-d784-487d-a190-4529008cb497\" (UID: \"69ff309d-d784-487d-a190-4529008cb497\") " Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.978449 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "69ff309d-d784-487d-a190-4529008cb497" (UID: "69ff309d-d784-487d-a190-4529008cb497"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.978876 4790 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.979516 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-config-data" (OuterVolumeSpecName: "config-data") pod "69ff309d-d784-487d-a190-4529008cb497" (UID: "69ff309d-d784-487d-a190-4529008cb497"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.983535 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "69ff309d-d784-487d-a190-4529008cb497" (UID: "69ff309d-d784-487d-a190-4529008cb497"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.988710 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "69ff309d-d784-487d-a190-4529008cb497" (UID: "69ff309d-d784-487d-a190-4529008cb497"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 03 10:17:38 crc kubenswrapper[4790]: I1003 10:17:38.989424 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ff309d-d784-487d-a190-4529008cb497-kube-api-access-sm8pg" (OuterVolumeSpecName: "kube-api-access-sm8pg") pod "69ff309d-d784-487d-a190-4529008cb497" (UID: "69ff309d-d784-487d-a190-4529008cb497"). InnerVolumeSpecName "kube-api-access-sm8pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.010211 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "69ff309d-d784-487d-a190-4529008cb497" (UID: "69ff309d-d784-487d-a190-4529008cb497"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.018445 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "69ff309d-d784-487d-a190-4529008cb497" (UID: "69ff309d-d784-487d-a190-4529008cb497"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.020497 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "69ff309d-d784-487d-a190-4529008cb497" (UID: "69ff309d-d784-487d-a190-4529008cb497"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.045166 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "69ff309d-d784-487d-a190-4529008cb497" (UID: "69ff309d-d784-487d-a190-4529008cb497"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.081568 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-config-data\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.081712 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.081733 4790 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/69ff309d-d784-487d-a190-4529008cb497-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.081746 4790 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.081758 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm8pg\" (UniqueName: \"kubernetes.io/projected/69ff309d-d784-487d-a190-4529008cb497-kube-api-access-sm8pg\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.081797 4790 reconciler_common.go:293] "Volume detached for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69ff309d-d784-487d-a190-4529008cb497-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.081868 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.081885 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/69ff309d-d784-487d-a190-4529008cb497-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.109422 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.184417 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.488214 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"69ff309d-d784-487d-a190-4529008cb497","Type":"ContainerDied","Data":"9565be091cb3cbf943f5ebb99d76d02e775f0f8a5d2bad58d9a6ab36356159e1"} Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.488295 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9565be091cb3cbf943f5ebb99d76d02e775f0f8a5d2bad58d9a6ab36356159e1" Oct 03 10:17:39 crc kubenswrapper[4790]: I1003 10:17:39.488323 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 03 10:17:46 crc kubenswrapper[4790]: I1003 10:17:46.329048 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:17:46 crc kubenswrapper[4790]: E1003 10:17:46.330334 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.446653 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 03 10:17:50 crc kubenswrapper[4790]: E1003 10:17:50.447823 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerName="extract-content" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.447840 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerName="extract-content" Oct 03 10:17:50 crc kubenswrapper[4790]: E1003 10:17:50.447855 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerName="extract-utilities" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.447864 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerName="extract-utilities" Oct 03 10:17:50 crc kubenswrapper[4790]: E1003 10:17:50.447911 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerName="registry-server" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.447918 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerName="registry-server" Oct 03 10:17:50 crc kubenswrapper[4790]: E1003 10:17:50.447935 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ff309d-d784-487d-a190-4529008cb497" containerName="tempest-tests-tempest-tests-runner" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.447943 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ff309d-d784-487d-a190-4529008cb497" containerName="tempest-tests-tempest-tests-runner" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.448203 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="92eb12f6-b080-40b7-a31e-73d4779f5ea9" containerName="registry-server" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.448244 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ff309d-d784-487d-a190-4529008cb497" containerName="tempest-tests-tempest-tests-runner" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.449146 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.452391 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xmfm8" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.474053 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.632274 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zgb8\" (UniqueName: \"kubernetes.io/projected/516af441-b673-4c5b-840f-1f9bda8cdc5a-kube-api-access-5zgb8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"516af441-b673-4c5b-840f-1f9bda8cdc5a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.633033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"516af441-b673-4c5b-840f-1f9bda8cdc5a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.735130 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zgb8\" (UniqueName: \"kubernetes.io/projected/516af441-b673-4c5b-840f-1f9bda8cdc5a-kube-api-access-5zgb8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"516af441-b673-4c5b-840f-1f9bda8cdc5a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.735222 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"516af441-b673-4c5b-840f-1f9bda8cdc5a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.735690 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"516af441-b673-4c5b-840f-1f9bda8cdc5a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.754244 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zgb8\" (UniqueName: \"kubernetes.io/projected/516af441-b673-4c5b-840f-1f9bda8cdc5a-kube-api-access-5zgb8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"516af441-b673-4c5b-840f-1f9bda8cdc5a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.761529 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"516af441-b673-4c5b-840f-1f9bda8cdc5a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 10:17:50 crc kubenswrapper[4790]: I1003 10:17:50.773059 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 03 10:17:51 crc kubenswrapper[4790]: I1003 10:17:51.283613 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 03 10:17:51 crc kubenswrapper[4790]: I1003 10:17:51.615054 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"516af441-b673-4c5b-840f-1f9bda8cdc5a","Type":"ContainerStarted","Data":"1b0375d0b41352807d5fc6b6ff776abf7f297d491512a9ab6ad19390f7385e8e"} Oct 03 10:17:52 crc kubenswrapper[4790]: I1003 10:17:52.624661 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"516af441-b673-4c5b-840f-1f9bda8cdc5a","Type":"ContainerStarted","Data":"6e07308f27e7ac82bfc2a15130a3b5a40c61a523f4f91d9c6a2302307aead59b"} Oct 03 10:17:52 crc kubenswrapper[4790]: I1003 10:17:52.643276 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.8696345189999999 podStartE2EDuration="2.643255602s" podCreationTimestamp="2025-10-03 10:17:50 +0000 UTC" firstStartedPulling="2025-10-03 10:17:51.288011089 +0000 UTC m=+5857.640945327" lastFinishedPulling="2025-10-03 10:17:52.061632172 +0000 UTC m=+5858.414566410" observedRunningTime="2025-10-03 10:17:52.634483591 +0000 UTC m=+5858.987417829" watchObservedRunningTime="2025-10-03 10:17:52.643255602 +0000 UTC m=+5858.996189840" Oct 03 10:17:58 crc kubenswrapper[4790]: I1003 10:17:58.329154 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:17:58 crc kubenswrapper[4790]: E1003 10:17:58.330210 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:18:13 crc kubenswrapper[4790]: I1003 10:18:13.328523 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:18:13 crc kubenswrapper[4790]: E1003 10:18:13.332984 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.026817 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hlhcg/must-gather-bgnkz"] Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.028803 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/must-gather-bgnkz" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.032567 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hlhcg"/"openshift-service-ca.crt" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.033824 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hlhcg"/"kube-root-ca.crt" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.046929 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hlhcg/must-gather-bgnkz"] Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.171424 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c22cc451-27ee-4530-8185-d5285f391019-must-gather-output\") pod \"must-gather-bgnkz\" (UID: \"c22cc451-27ee-4530-8185-d5285f391019\") " pod="openshift-must-gather-hlhcg/must-gather-bgnkz" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.171535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkk4x\" (UniqueName: \"kubernetes.io/projected/c22cc451-27ee-4530-8185-d5285f391019-kube-api-access-dkk4x\") pod \"must-gather-bgnkz\" (UID: \"c22cc451-27ee-4530-8185-d5285f391019\") " pod="openshift-must-gather-hlhcg/must-gather-bgnkz" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.273214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkk4x\" (UniqueName: \"kubernetes.io/projected/c22cc451-27ee-4530-8185-d5285f391019-kube-api-access-dkk4x\") pod \"must-gather-bgnkz\" (UID: \"c22cc451-27ee-4530-8185-d5285f391019\") " pod="openshift-must-gather-hlhcg/must-gather-bgnkz" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.273391 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c22cc451-27ee-4530-8185-d5285f391019-must-gather-output\") pod \"must-gather-bgnkz\" (UID: \"c22cc451-27ee-4530-8185-d5285f391019\") " pod="openshift-must-gather-hlhcg/must-gather-bgnkz" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.273866 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c22cc451-27ee-4530-8185-d5285f391019-must-gather-output\") pod \"must-gather-bgnkz\" (UID: \"c22cc451-27ee-4530-8185-d5285f391019\") " pod="openshift-must-gather-hlhcg/must-gather-bgnkz" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.294280 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkk4x\" (UniqueName: \"kubernetes.io/projected/c22cc451-27ee-4530-8185-d5285f391019-kube-api-access-dkk4x\") pod \"must-gather-bgnkz\" (UID: \"c22cc451-27ee-4530-8185-d5285f391019\") " pod="openshift-must-gather-hlhcg/must-gather-bgnkz" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.349282 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/must-gather-bgnkz" Oct 03 10:18:14 crc kubenswrapper[4790]: I1003 10:18:14.833329 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hlhcg/must-gather-bgnkz"] Oct 03 10:18:15 crc kubenswrapper[4790]: I1003 10:18:15.855346 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/must-gather-bgnkz" event={"ID":"c22cc451-27ee-4530-8185-d5285f391019","Type":"ContainerStarted","Data":"1f56e8b746b17d30e99ed2757469b1d60b37976a224e4b269603ba973529f290"} Oct 03 10:18:21 crc kubenswrapper[4790]: I1003 10:18:21.950700 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/must-gather-bgnkz" event={"ID":"c22cc451-27ee-4530-8185-d5285f391019","Type":"ContainerStarted","Data":"b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b"} Oct 03 10:18:21 crc kubenswrapper[4790]: I1003 10:18:21.951229 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/must-gather-bgnkz" event={"ID":"c22cc451-27ee-4530-8185-d5285f391019","Type":"ContainerStarted","Data":"36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b"} Oct 03 10:18:21 crc kubenswrapper[4790]: I1003 10:18:21.975894 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hlhcg/must-gather-bgnkz" podStartSLOduration=2.056394208 podStartE2EDuration="7.975877589s" podCreationTimestamp="2025-10-03 10:18:14 +0000 UTC" firstStartedPulling="2025-10-03 10:18:14.841848053 +0000 UTC m=+5881.194782291" lastFinishedPulling="2025-10-03 10:18:20.761331434 +0000 UTC m=+5887.114265672" observedRunningTime="2025-10-03 10:18:21.965731049 +0000 UTC m=+5888.318665307" watchObservedRunningTime="2025-10-03 10:18:21.975877589 +0000 UTC m=+5888.328811817" Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.328484 4790 scope.go:117] "RemoveContainer" 
containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:18:25 crc kubenswrapper[4790]: E1003 10:18:25.329320 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.520968 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hlhcg/crc-debug-qhcjl"] Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.523205 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.529205 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hlhcg"/"default-dockercfg-h5ccg" Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.644842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv69k\" (UniqueName: \"kubernetes.io/projected/2a34afb2-c892-4447-81a6-db7880c35820-kube-api-access-bv69k\") pod \"crc-debug-qhcjl\" (UID: \"2a34afb2-c892-4447-81a6-db7880c35820\") " pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.644934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a34afb2-c892-4447-81a6-db7880c35820-host\") pod \"crc-debug-qhcjl\" (UID: \"2a34afb2-c892-4447-81a6-db7880c35820\") " pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.746881 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv69k\" (UniqueName: \"kubernetes.io/projected/2a34afb2-c892-4447-81a6-db7880c35820-kube-api-access-bv69k\") pod \"crc-debug-qhcjl\" (UID: \"2a34afb2-c892-4447-81a6-db7880c35820\") " pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.746968 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a34afb2-c892-4447-81a6-db7880c35820-host\") pod \"crc-debug-qhcjl\" (UID: \"2a34afb2-c892-4447-81a6-db7880c35820\") " pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.747191 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a34afb2-c892-4447-81a6-db7880c35820-host\") pod \"crc-debug-qhcjl\" (UID: \"2a34afb2-c892-4447-81a6-db7880c35820\") " pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.768398 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv69k\" (UniqueName: \"kubernetes.io/projected/2a34afb2-c892-4447-81a6-db7880c35820-kube-api-access-bv69k\") pod \"crc-debug-qhcjl\" (UID: \"2a34afb2-c892-4447-81a6-db7880c35820\") " pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" Oct 03 10:18:25 crc kubenswrapper[4790]: I1003 10:18:25.844278 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" Oct 03 10:18:26 crc kubenswrapper[4790]: I1003 10:18:26.020822 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" event={"ID":"2a34afb2-c892-4447-81a6-db7880c35820","Type":"ContainerStarted","Data":"0bcd5b739272c172c1b12ad9323c47900f42f420f572966b73428b2ca8ecb0d7"} Oct 03 10:18:36 crc kubenswrapper[4790]: I1003 10:18:36.328150 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:18:36 crc kubenswrapper[4790]: E1003 10:18:36.328956 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:18:37 crc kubenswrapper[4790]: I1003 10:18:37.171899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" event={"ID":"2a34afb2-c892-4447-81a6-db7880c35820","Type":"ContainerStarted","Data":"3d5e61aff8681fa2b49f167ac095a0e9d8a36057541842e94d3f2a2911d20e84"} Oct 03 10:18:37 crc kubenswrapper[4790]: I1003 10:18:37.194172 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" podStartSLOduration=1.210832861 podStartE2EDuration="12.1941496s" podCreationTimestamp="2025-10-03 10:18:25 +0000 UTC" firstStartedPulling="2025-10-03 10:18:25.881911346 +0000 UTC m=+5892.234845584" lastFinishedPulling="2025-10-03 10:18:36.865228085 +0000 UTC m=+5903.218162323" observedRunningTime="2025-10-03 10:18:37.188380111 +0000 UTC m=+5903.541314349" watchObservedRunningTime="2025-10-03 10:18:37.1941496 +0000 UTC 
m=+5903.547083828" Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.221341 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s64mv"] Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.224537 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s64mv" Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.235147 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s64mv"] Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.265739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-utilities\") pod \"certified-operators-s64mv\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") " pod="openshift-marketplace/certified-operators-s64mv" Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.265797 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-catalog-content\") pod \"certified-operators-s64mv\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") " pod="openshift-marketplace/certified-operators-s64mv" Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.265861 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqbqm\" (UniqueName: \"kubernetes.io/projected/222f02b3-fb45-4b55-b0a4-bcb6032786c0-kube-api-access-xqbqm\") pod \"certified-operators-s64mv\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") " pod="openshift-marketplace/certified-operators-s64mv" Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.368449 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-utilities\") pod \"certified-operators-s64mv\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") " pod="openshift-marketplace/certified-operators-s64mv" Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.368518 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-catalog-content\") pod \"certified-operators-s64mv\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") " pod="openshift-marketplace/certified-operators-s64mv" Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.368598 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqbqm\" (UniqueName: \"kubernetes.io/projected/222f02b3-fb45-4b55-b0a4-bcb6032786c0-kube-api-access-xqbqm\") pod \"certified-operators-s64mv\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") " pod="openshift-marketplace/certified-operators-s64mv" Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.369549 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-utilities\") pod \"certified-operators-s64mv\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") " pod="openshift-marketplace/certified-operators-s64mv" Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.369858 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-catalog-content\") pod \"certified-operators-s64mv\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") " pod="openshift-marketplace/certified-operators-s64mv" Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.398253 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqbqm\" (UniqueName: 
\"kubernetes.io/projected/222f02b3-fb45-4b55-b0a4-bcb6032786c0-kube-api-access-xqbqm\") pod \"certified-operators-s64mv\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") " pod="openshift-marketplace/certified-operators-s64mv"
Oct 03 10:18:41 crc kubenswrapper[4790]: I1003 10:18:41.559092 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s64mv"
Oct 03 10:18:44 crc kubenswrapper[4790]: I1003 10:18:44.085940 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s64mv"]
Oct 03 10:18:44 crc kubenswrapper[4790]: W1003 10:18:44.090921 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod222f02b3_fb45_4b55_b0a4_bcb6032786c0.slice/crio-53abed004c58b9929fc1e2698149ca42171aacec3f052470272146d7d8e45271 WatchSource:0}: Error finding container 53abed004c58b9929fc1e2698149ca42171aacec3f052470272146d7d8e45271: Status 404 returned error can't find the container with id 53abed004c58b9929fc1e2698149ca42171aacec3f052470272146d7d8e45271
Oct 03 10:18:44 crc kubenswrapper[4790]: I1003 10:18:44.246233 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s64mv" event={"ID":"222f02b3-fb45-4b55-b0a4-bcb6032786c0","Type":"ContainerStarted","Data":"53abed004c58b9929fc1e2698149ca42171aacec3f052470272146d7d8e45271"}
Oct 03 10:18:45 crc kubenswrapper[4790]: I1003 10:18:45.257934 4790 generic.go:334] "Generic (PLEG): container finished" podID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerID="5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e" exitCode=0
Oct 03 10:18:45 crc kubenswrapper[4790]: I1003 10:18:45.257999 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s64mv" event={"ID":"222f02b3-fb45-4b55-b0a4-bcb6032786c0","Type":"ContainerDied","Data":"5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e"}
Oct 03 10:18:46 crc kubenswrapper[4790]: I1003 10:18:46.268642 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s64mv" event={"ID":"222f02b3-fb45-4b55-b0a4-bcb6032786c0","Type":"ContainerStarted","Data":"883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84"}
Oct 03 10:18:48 crc kubenswrapper[4790]: I1003 10:18:48.289995 4790 generic.go:334] "Generic (PLEG): container finished" podID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerID="883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84" exitCode=0
Oct 03 10:18:48 crc kubenswrapper[4790]: I1003 10:18:48.290256 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s64mv" event={"ID":"222f02b3-fb45-4b55-b0a4-bcb6032786c0","Type":"ContainerDied","Data":"883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84"}
Oct 03 10:18:49 crc kubenswrapper[4790]: I1003 10:18:49.327655 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037"
Oct 03 10:18:49 crc kubenswrapper[4790]: E1003 10:18:49.328523 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b"
Oct 03 10:18:51 crc kubenswrapper[4790]: I1003 10:18:51.328181 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s64mv" event={"ID":"222f02b3-fb45-4b55-b0a4-bcb6032786c0","Type":"ContainerStarted","Data":"e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702"}
Oct 03 10:18:51 crc kubenswrapper[4790]: I1003 10:18:51.380387 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s64mv" podStartSLOduration=6.853392107 podStartE2EDuration="10.380361176s" podCreationTimestamp="2025-10-03 10:18:41 +0000 UTC" firstStartedPulling="2025-10-03 10:18:45.260263125 +0000 UTC m=+5911.613197363" lastFinishedPulling="2025-10-03 10:18:48.787232194 +0000 UTC m=+5915.140166432" observedRunningTime="2025-10-03 10:18:51.34751283 +0000 UTC m=+5917.700447068" watchObservedRunningTime="2025-10-03 10:18:51.380361176 +0000 UTC m=+5917.733295414"
Oct 03 10:18:51 crc kubenswrapper[4790]: I1003 10:18:51.560694 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s64mv"
Oct 03 10:18:51 crc kubenswrapper[4790]: I1003 10:18:51.560757 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s64mv"
Oct 03 10:18:52 crc kubenswrapper[4790]: I1003 10:18:52.616395 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-s64mv" podUID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerName="registry-server" probeResult="failure" output=<
Oct 03 10:18:52 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s
Oct 03 10:18:52 crc kubenswrapper[4790]: >
Oct 03 10:19:01 crc kubenswrapper[4790]: I1003 10:19:01.328901 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037"
Oct 03 10:19:01 crc kubenswrapper[4790]: E1003 10:19:01.329848 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b"
Oct 03 10:19:01 crc kubenswrapper[4790]: I1003 10:19:01.618288 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s64mv"
Oct 03 10:19:01 crc kubenswrapper[4790]: I1003 10:19:01.674089 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s64mv"
Oct 03 10:19:01 crc kubenswrapper[4790]: I1003 10:19:01.863105 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s64mv"]
Oct 03 10:19:03 crc kubenswrapper[4790]: I1003 10:19:03.453211 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s64mv" podUID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerName="registry-server" containerID="cri-o://e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702" gracePeriod=2
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.036420 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s64mv"
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.094444 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqbqm\" (UniqueName: \"kubernetes.io/projected/222f02b3-fb45-4b55-b0a4-bcb6032786c0-kube-api-access-xqbqm\") pod \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") "
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.094740 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-catalog-content\") pod \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") "
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.094891 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-utilities\") pod \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\" (UID: \"222f02b3-fb45-4b55-b0a4-bcb6032786c0\") "
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.096338 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-utilities" (OuterVolumeSpecName: "utilities") pod "222f02b3-fb45-4b55-b0a4-bcb6032786c0" (UID: "222f02b3-fb45-4b55-b0a4-bcb6032786c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.115834 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/222f02b3-fb45-4b55-b0a4-bcb6032786c0-kube-api-access-xqbqm" (OuterVolumeSpecName: "kube-api-access-xqbqm") pod "222f02b3-fb45-4b55-b0a4-bcb6032786c0" (UID: "222f02b3-fb45-4b55-b0a4-bcb6032786c0"). InnerVolumeSpecName "kube-api-access-xqbqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.149980 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "222f02b3-fb45-4b55-b0a4-bcb6032786c0" (UID: "222f02b3-fb45-4b55-b0a4-bcb6032786c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.197277 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.197321 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqbqm\" (UniqueName: \"kubernetes.io/projected/222f02b3-fb45-4b55-b0a4-bcb6032786c0-kube-api-access-xqbqm\") on node \"crc\" DevicePath \"\""
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.197336 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/222f02b3-fb45-4b55-b0a4-bcb6032786c0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.463588 4790 generic.go:334] "Generic (PLEG): container finished" podID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerID="e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702" exitCode=0
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.463631 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s64mv" event={"ID":"222f02b3-fb45-4b55-b0a4-bcb6032786c0","Type":"ContainerDied","Data":"e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702"}
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.463660 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s64mv" event={"ID":"222f02b3-fb45-4b55-b0a4-bcb6032786c0","Type":"ContainerDied","Data":"53abed004c58b9929fc1e2698149ca42171aacec3f052470272146d7d8e45271"}
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.463680 4790 scope.go:117] "RemoveContainer" containerID="e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702"
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.463813 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s64mv"
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.492785 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s64mv"]
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.502548 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s64mv"]
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.618676 4790 scope.go:117] "RemoveContainer" containerID="883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84"
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.654032 4790 scope.go:117] "RemoveContainer" containerID="5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e"
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.696720 4790 scope.go:117] "RemoveContainer" containerID="e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702"
Oct 03 10:19:04 crc kubenswrapper[4790]: E1003 10:19:04.697222 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702\": container with ID starting with e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702 not found: ID does not exist" containerID="e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702"
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.697281 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702"} err="failed to get container status \"e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702\": rpc error: code = NotFound desc = could not find container \"e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702\": container with ID starting with e14bf68123de8caa266f7b955bef46ded327a7470e37eb340c6e586f4d899702 not found: ID does not exist"
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.697308 4790 scope.go:117] "RemoveContainer" containerID="883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84"
Oct 03 10:19:04 crc kubenswrapper[4790]: E1003 10:19:04.697777 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84\": container with ID starting with 883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84 not found: ID does not exist" containerID="883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84"
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.697806 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84"} err="failed to get container status \"883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84\": rpc error: code = NotFound desc = could not find container \"883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84\": container with ID starting with 883f18fafa8693731d4c44a5fdc608c5efae4a6bd15c0ed5302e70f273726f84 not found: ID does not exist"
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.697830 4790 scope.go:117] "RemoveContainer" containerID="5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e"
Oct 03 10:19:04 crc kubenswrapper[4790]: E1003 10:19:04.698074 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e\": container with ID starting with 5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e not found: ID does not exist" containerID="5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e"
Oct 03 10:19:04 crc kubenswrapper[4790]: I1003 10:19:04.698105 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e"} err="failed to get container status \"5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e\": rpc error: code = NotFound desc = could not find container \"5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e\": container with ID starting with 5b03f7f362c3b2b4d0d63e6c1dadaf40009ee048100bb84eb876c8a84542642e not found: ID does not exist"
Oct 03 10:19:06 crc kubenswrapper[4790]: I1003 10:19:06.356181 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" path="/var/lib/kubelet/pods/222f02b3-fb45-4b55-b0a4-bcb6032786c0/volumes"
Oct 03 10:19:14 crc kubenswrapper[4790]: I1003 10:19:14.339844 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037"
Oct 03 10:19:14 crc kubenswrapper[4790]: E1003 10:19:14.344393 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b"
Oct 03 10:19:29 crc kubenswrapper[4790]: I1003 10:19:29.328591 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037"
Oct 03 10:19:29 crc kubenswrapper[4790]: E1003 10:19:29.329462 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b"
Oct 03 10:19:42 crc kubenswrapper[4790]: I1003 10:19:42.328459 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037"
Oct 03 10:19:42 crc kubenswrapper[4790]: I1003 10:19:42.868808 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"72e7e65c5c41feebe96198a1537bafbf79d8997c016f7bcda12de7ad1bebd3e6"}
Oct 03 10:19:51 crc kubenswrapper[4790]: I1003 10:19:51.016774 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55549c44b-l2c25_f1f79552-9846-40d8-b095-ac773b5af079/barbican-api/0.log"
Oct 03 10:19:51 crc kubenswrapper[4790]: I1003 10:19:51.064446 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55549c44b-l2c25_f1f79552-9846-40d8-b095-ac773b5af079/barbican-api-log/0.log"
Oct 03 10:19:51 crc kubenswrapper[4790]: I1003 10:19:51.348815 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-56674f8d84-pn6qq_0db45010-78ab-486c-b5be-bc2431f3ab9d/barbican-keystone-listener/0.log"
Oct 03 10:19:51 crc kubenswrapper[4790]: I1003 10:19:51.386283 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-56674f8d84-pn6qq_0db45010-78ab-486c-b5be-bc2431f3ab9d/barbican-keystone-listener-log/0.log"
Oct 03 10:19:51 crc kubenswrapper[4790]: I1003 10:19:51.546401 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-765d8bdc5-vfk8w_7a3c6af9-6ba2-4e18-8779-1e71dbc175a1/barbican-worker/0.log"
Oct 03 10:19:51 crc kubenswrapper[4790]: I1003 10:19:51.609651 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-765d8bdc5-vfk8w_7a3c6af9-6ba2-4e18-8779-1e71dbc175a1/barbican-worker-log/0.log"
Oct 03 10:19:51 crc kubenswrapper[4790]: I1003 10:19:51.767540 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg_97eb47a0-1a88-49ee-9d24-b9f0624ed543/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:19:52 crc kubenswrapper[4790]: I1003 10:19:52.078483 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab369c6c-af88-4f07-bc5c-fb229a9b2cc8/ceilometer-notification-agent/0.log"
Oct 03 10:19:52 crc kubenswrapper[4790]: I1003 10:19:52.126207 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab369c6c-af88-4f07-bc5c-fb229a9b2cc8/ceilometer-central-agent/0.log"
Oct 03 10:19:52 crc kubenswrapper[4790]: I1003 10:19:52.144356 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab369c6c-af88-4f07-bc5c-fb229a9b2cc8/proxy-httpd/0.log"
Oct 03 10:19:52 crc kubenswrapper[4790]: I1003 10:19:52.302942 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab369c6c-af88-4f07-bc5c-fb229a9b2cc8/sg-core/0.log"
Oct 03 10:19:52 crc kubenswrapper[4790]: I1003 10:19:52.642957 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a7f77060-c6da-48ba-9c55-e454fd1da9fd/cinder-api-log/0.log"
Oct 03 10:19:53 crc kubenswrapper[4790]: I1003 10:19:53.126485 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_8927cfc4-2b2e-4a79-bd9e-3ff95160b86a/probe/0.log"
Oct 03 10:19:53 crc kubenswrapper[4790]: I1003 10:19:53.525400 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a7f77060-c6da-48ba-9c55-e454fd1da9fd/cinder-api/0.log"
Oct 03 10:19:53 crc kubenswrapper[4790]: I1003 10:19:53.652480 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_8927cfc4-2b2e-4a79-bd9e-3ff95160b86a/cinder-backup/0.log"
Oct 03 10:19:53 crc kubenswrapper[4790]: I1003 10:19:53.661406 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1289ded7-2702-43dc-9d28-00567ba7b668/cinder-scheduler/0.log"
Oct 03 10:19:53 crc kubenswrapper[4790]: I1003 10:19:53.812243 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1289ded7-2702-43dc-9d28-00567ba7b668/probe/0.log"
Oct 03 10:19:54 crc kubenswrapper[4790]: I1003 10:19:54.155129 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_63400791-2785-485b-86b4-ffd9943163a5/probe/0.log"
Oct 03 10:19:54 crc kubenswrapper[4790]: I1003 10:19:54.244815 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_63400791-2785-485b-86b4-ffd9943163a5/cinder-volume/0.log"
Oct 03 10:19:54 crc kubenswrapper[4790]: I1003 10:19:54.474400 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_313b0b89-2532-482b-994d-317dfc54b6a7/probe/0.log"
Oct 03 10:19:54 crc kubenswrapper[4790]: I1003 10:19:54.649682 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff_94a4a6e2-8a66-4144-b0ea-893b7519c242/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:19:54 crc kubenswrapper[4790]: I1003 10:19:54.749015 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_313b0b89-2532-482b-994d-317dfc54b6a7/cinder-volume/0.log"
Oct 03 10:19:54 crc kubenswrapper[4790]: I1003 10:19:54.798096 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8_e3ec6c6b-f744-4d2d-a924-32f4682b1dfa/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:19:54 crc kubenswrapper[4790]: I1003 10:19:54.962896 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7_46adefe5-7949-4e5f-a07c-81569f5f6ad8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:19:54 crc kubenswrapper[4790]: I1003 10:19:54.975539 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79ff9d4d8f-zfwpx_33a1dfa6-9323-4d22-b0c1-0bd210b9638f/init/0.log"
Oct 03 10:19:55 crc kubenswrapper[4790]: I1003 10:19:55.231058 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79ff9d4d8f-zfwpx_33a1dfa6-9323-4d22-b0c1-0bd210b9638f/init/0.log"
Oct 03 10:19:55 crc kubenswrapper[4790]: I1003 10:19:55.273726 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz_2542827b-a700-41bf-80ff-22928d83630a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:19:55 crc kubenswrapper[4790]: I1003 10:19:55.371234 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79ff9d4d8f-zfwpx_33a1dfa6-9323-4d22-b0c1-0bd210b9638f/dnsmasq-dns/0.log"
Oct 03 10:19:55 crc kubenswrapper[4790]: I1003 10:19:55.519700 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_42c4562a-910e-47f6-befc-effb8f851374/glance-httpd/0.log"
Oct 03 10:19:55 crc kubenswrapper[4790]: I1003 10:19:55.743205 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_42c4562a-910e-47f6-befc-effb8f851374/glance-log/0.log"
Oct 03 10:19:55 crc kubenswrapper[4790]: I1003 10:19:55.903599 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d1558347-4d48-4e51-a89c-a3d8664016a4/glance-log/0.log"
Oct 03 10:19:55 crc kubenswrapper[4790]: I1003 10:19:55.933149 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d1558347-4d48-4e51-a89c-a3d8664016a4/glance-httpd/0.log"
Oct 03 10:19:56 crc kubenswrapper[4790]: I1003 10:19:56.162158 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-86f486799b-gwc4l_6b9b4763-1706-4c7e-b39f-0d8311a9637d/horizon/0.log"
Oct 03 10:19:56 crc kubenswrapper[4790]: I1003 10:19:56.244550 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-x28c6_037bc62e-a9b6-45c4-9bce-9395a372bc90/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:19:56 crc kubenswrapper[4790]: I1003 10:19:56.515876 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-nb4kf_eae0b967-81c1-45eb-8240-26198c43a621/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:19:56 crc kubenswrapper[4790]: I1003 10:19:56.821511 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29324701-xvbcm_0eab3760-3551-42a8-9eaa-f53d4b94a8ca/keystone-cron/0.log"
Oct 03 10:19:56 crc kubenswrapper[4790]: I1003 10:19:56.918221 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-86f486799b-gwc4l_6b9b4763-1706-4c7e-b39f-0d8311a9637d/horizon-log/0.log"
Oct 03 10:19:57 crc kubenswrapper[4790]: I1003 10:19:57.061945 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29324761-s6cn9_95907765-166e-4e38-a906-86b4020f3b5c/keystone-cron/0.log"
Oct 03 10:19:57 crc kubenswrapper[4790]: I1003 10:19:57.274991 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_272a9be6-ce74-4ee6-a1d2-17a8935d8c2e/kube-state-metrics/0.log"
Oct 03 10:19:57 crc kubenswrapper[4790]: I1003 10:19:57.313068 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7c4ff96776-8dljp_750b0c2b-c871-46f7-b318-78ace47356a9/keystone-api/0.log"
Oct 03 10:19:57 crc kubenswrapper[4790]: I1003 10:19:57.436717 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-pgz87_60e50039-1bd7-4bd0-98ae-f1285c701e2e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:19:58 crc kubenswrapper[4790]: I1003 10:19:58.147252 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-95d849fd9-jww5w_11739e3d-ef66-4152-bd55-6c3de0dd19e2/neutron-httpd/0.log"
Oct 03 10:19:58 crc kubenswrapper[4790]: I1003 10:19:58.154546 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-95d849fd9-jww5w_11739e3d-ef66-4152-bd55-6c3de0dd19e2/neutron-api/0.log"
Oct 03 10:19:58 crc kubenswrapper[4790]: I1003 10:19:58.205534 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb_1e83d303-3e2f-4dab-b67f-de344d5fe175/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:19:59 crc kubenswrapper[4790]: I1003 10:19:59.346029 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_46dabafe-d4c6-4b3e-84f6-3eb83eee9647/nova-cell0-conductor-conductor/0.log"
Oct 03 10:20:00 crc kubenswrapper[4790]: I1003 10:20:00.272005 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a529d30e-fdca-4eb5-8233-34e63534a164/nova-cell1-conductor-conductor/0.log"
Oct 03 10:20:00 crc kubenswrapper[4790]: I1003 10:20:00.508695 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f3e4ca6a-6d02-4c7e-b549-64e5900371ab/nova-api-log/0.log"
Oct 03 10:20:00 crc kubenswrapper[4790]: I1003 10:20:00.695443 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f3e4ca6a-6d02-4c7e-b549-64e5900371ab/nova-api-api/0.log"
Oct 03 10:20:00 crc kubenswrapper[4790]: I1003 10:20:00.885014 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bbe5d109-dfdc-4f03-afa3-a71800a7d74b/nova-cell1-novncproxy-novncproxy/0.log"
Oct 03 10:20:01 crc kubenswrapper[4790]: I1003 10:20:01.078376 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-z7tfh_7e86fc9f-d34e-41ca-b945-b8b45136512f/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:20:01 crc kubenswrapper[4790]: I1003 10:20:01.230140 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b829e543-5d7d-4e9b-9674-9c24d060a0c5/nova-metadata-log/0.log"
Oct 03 10:20:01 crc kubenswrapper[4790]: I1003 10:20:01.808054 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_87e3a414-fd4b-49ac-8e64-3514f9725df1/nova-scheduler-scheduler/0.log"
Oct 03 10:20:01 crc kubenswrapper[4790]: I1003 10:20:01.958608 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_89aeb08e-2699-4c1b-99ef-4ae2cfce9de3/mysql-bootstrap/0.log"
Oct 03 10:20:02 crc kubenswrapper[4790]: I1003 10:20:02.200591 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_89aeb08e-2699-4c1b-99ef-4ae2cfce9de3/mysql-bootstrap/0.log"
Oct 03 10:20:02 crc kubenswrapper[4790]: I1003 10:20:02.228090 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_89aeb08e-2699-4c1b-99ef-4ae2cfce9de3/galera/0.log"
Oct 03 10:20:02 crc kubenswrapper[4790]: I1003 10:20:02.501405 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_010b668f-22e4-4ad8-aae5-decf063cd49b/mysql-bootstrap/0.log"
Oct 03 10:20:02 crc kubenswrapper[4790]: I1003 10:20:02.717033 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_010b668f-22e4-4ad8-aae5-decf063cd49b/mysql-bootstrap/0.log"
Oct 03 10:20:02 crc kubenswrapper[4790]: I1003 10:20:02.809035 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_010b668f-22e4-4ad8-aae5-decf063cd49b/galera/0.log"
Oct 03 10:20:03 crc kubenswrapper[4790]: I1003 10:20:03.126781 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7a835cde-9c23-4cf9-9473-35d01cf2c528/openstackclient/0.log"
Oct 03 10:20:03 crc kubenswrapper[4790]: I1003 10:20:03.460700 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-8ljvm_845dcf0a-43fb-4760-aac6-d12c02a8b785/ovn-controller/0.log"
Oct 03 10:20:03 crc kubenswrapper[4790]: I1003 10:20:03.667368 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6vcdh_f892381b-8fc9-4352-915e-62d0b92cb7b6/openstack-network-exporter/0.log"
Oct 03 10:20:03 crc kubenswrapper[4790]: I1003 10:20:03.933744 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vrsc7_5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec/ovsdb-server-init/0.log"
Oct 03 10:20:04 crc kubenswrapper[4790]: I1003 10:20:04.149082 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vrsc7_5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec/ovsdb-server-init/0.log"
Oct 03 10:20:04 crc kubenswrapper[4790]: I1003 10:20:04.341439 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vrsc7_5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec/ovsdb-server/0.log"
Oct 03 10:20:04 crc kubenswrapper[4790]: I1003 10:20:04.380522 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b829e543-5d7d-4e9b-9674-9c24d060a0c5/nova-metadata-metadata/0.log"
Oct 03 10:20:04 crc kubenswrapper[4790]: I1003 10:20:04.595280 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vrsc7_5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec/ovs-vswitchd/0.log"
Oct 03 10:20:04 crc kubenswrapper[4790]: I1003 10:20:04.691508 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-c77vh_f3195619-a867-4741-a9c5-fc995f682859/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:20:05 crc kubenswrapper[4790]: I1003 10:20:05.121506 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b/ovn-northd/0.log"
Oct 03 10:20:05 crc kubenswrapper[4790]: I1003 10:20:05.152114 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b/openstack-network-exporter/0.log"
Oct 03 10:20:05 crc kubenswrapper[4790]: I1003 10:20:05.324036 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6330b59a-ef23-4482-9fa5-58ccb2f6bc5a/openstack-network-exporter/0.log"
Oct 03 10:20:05 crc kubenswrapper[4790]: I1003 10:20:05.468550 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6330b59a-ef23-4482-9fa5-58ccb2f6bc5a/ovsdbserver-nb/0.log"
Oct 03 10:20:05 crc kubenswrapper[4790]: I1003 10:20:05.585249 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c53865bf-810d-440a-bfa4-171e3183066c/openstack-network-exporter/0.log"
Oct 03 10:20:05 crc kubenswrapper[4790]: I1003 10:20:05.703401 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c53865bf-810d-440a-bfa4-171e3183066c/ovsdbserver-sb/0.log"
Oct 03 10:20:06 crc kubenswrapper[4790]: I1003 10:20:06.087910 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d5469d486-dtt4h_38463967-5519-4ea0-8081-b22ac7858270/placement-api/0.log"
Oct 03 10:20:06 crc kubenswrapper[4790]: I1003 10:20:06.283390 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d5469d486-dtt4h_38463967-5519-4ea0-8081-b22ac7858270/placement-log/0.log"
Oct 03 10:20:06 crc kubenswrapper[4790]: I1003 10:20:06.296110 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ba88a153-0441-4fd3-a3b3-37338c589169/init-config-reloader/0.log"
Oct 03 10:20:06 crc kubenswrapper[4790]: I1003 10:20:06.543081 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ba88a153-0441-4fd3-a3b3-37338c589169/config-reloader/0.log"
Oct 03 10:20:06 crc kubenswrapper[4790]: I1003 10:20:06.556777 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ba88a153-0441-4fd3-a3b3-37338c589169/prometheus/0.log"
Oct 03 10:20:06 crc kubenswrapper[4790]: I1003 10:20:06.599926 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ba88a153-0441-4fd3-a3b3-37338c589169/init-config-reloader/0.log"
Oct 03 10:20:06 crc kubenswrapper[4790]: I1003 10:20:06.794638 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ba88a153-0441-4fd3-a3b3-37338c589169/thanos-sidecar/0.log"
Oct 03 10:20:06 crc kubenswrapper[4790]: I1003 10:20:06.848063 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9/setup-container/0.log"
Oct 03 10:20:07 crc kubenswrapper[4790]: I1003 10:20:07.107233 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9/rabbitmq/0.log"
Oct 03 10:20:07 crc kubenswrapper[4790]: I1003 10:20:07.109059 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9/setup-container/0.log"
Oct 03 10:20:07 crc kubenswrapper[4790]: I1003 10:20:07.287708 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_04926fe2-c5e2-4663-8945-448a17ec2a8b/setup-container/0.log"
Oct 03 10:20:07 crc kubenswrapper[4790]: I1003 10:20:07.532110 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_04926fe2-c5e2-4663-8945-448a17ec2a8b/setup-container/0.log"
Oct 03 10:20:07 crc kubenswrapper[4790]: I1003 10:20:07.641090 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_04926fe2-c5e2-4663-8945-448a17ec2a8b/rabbitmq/0.log"
Oct 03 10:20:07 crc kubenswrapper[4790]: I1003 10:20:07.758388 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_babbce79-b0e5-48e6-a5b7-b08e51f1dd69/setup-container/0.log"
Oct 03 10:20:08 crc kubenswrapper[4790]: I1003 10:20:08.051251 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_babbce79-b0e5-48e6-a5b7-b08e51f1dd69/rabbitmq/0.log"
Oct 03 10:20:08 crc kubenswrapper[4790]: I1003 10:20:08.082562 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_babbce79-b0e5-48e6-a5b7-b08e51f1dd69/setup-container/0.log"
Oct 03 10:20:08 crc kubenswrapper[4790]: I1003 10:20:08.389894 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-c52nn_06011acd-ae43-466d-b25d-70eddbc5c34f/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 03 10:20:08 crc kubenswrapper[4790]:
I1003 10:20:08.396763 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2_8cc4fd03-2e9a-4988-8e58-06d9df765283/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:20:08 crc kubenswrapper[4790]: I1003 10:20:08.601791 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9_551d5e7d-f7da-45c1-acbe-efa124afdd28/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:20:08 crc kubenswrapper[4790]: I1003 10:20:08.913095 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zm2qd_8dd56d83-b17f-4633-822c-e9ce0e3236d3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:20:08 crc kubenswrapper[4790]: I1003 10:20:08.927466 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-brzw8_44cc8b3a-08a0-490c-b1ba-e23b7da761a4/ssh-known-hosts-edpm-deployment/0.log" Oct 03 10:20:09 crc kubenswrapper[4790]: I1003 10:20:09.223267 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7657d6d659-5lb95_9ee67672-dab9-4d89-b6b4-02eb06e8fe8d/proxy-server/0.log" Oct 03 10:20:09 crc kubenswrapper[4790]: I1003 10:20:09.440032 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7657d6d659-5lb95_9ee67672-dab9-4d89-b6b4-02eb06e8fe8d/proxy-httpd/0.log" Oct 03 10:20:09 crc kubenswrapper[4790]: I1003 10:20:09.483938 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-5tpvp_4f9b13ce-6ec8-4e05-b395-c7ec580823ed/swift-ring-rebalance/0.log" Oct 03 10:20:09 crc kubenswrapper[4790]: I1003 10:20:09.708124 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/account-auditor/0.log" Oct 03 10:20:09 crc kubenswrapper[4790]: I1003 10:20:09.968322 4790 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/account-reaper/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.133489 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/account-server/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.192969 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/container-auditor/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.241186 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/account-replicator/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.397135 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/container-server/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.410992 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/container-replicator/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.454866 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/container-updater/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.649691 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/object-expirer/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.678549 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/object-auditor/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.774419 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/object-replicator/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.887541 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/object-server/0.log" Oct 03 10:20:10 crc kubenswrapper[4790]: I1003 10:20:10.931211 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/object-updater/0.log" Oct 03 10:20:11 crc kubenswrapper[4790]: I1003 10:20:11.038915 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/rsync/0.log" Oct 03 10:20:11 crc kubenswrapper[4790]: I1003 10:20:11.130916 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/swift-recon-cron/0.log" Oct 03 10:20:11 crc kubenswrapper[4790]: I1003 10:20:11.372312 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-48d4w_780db1d7-b699-4f38-9f86-c6bcf5c7c9a6/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:20:11 crc kubenswrapper[4790]: I1003 10:20:11.466646 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_69ff309d-d784-487d-a190-4529008cb497/tempest-tests-tempest-tests-runner/0.log" Oct 03 10:20:11 crc kubenswrapper[4790]: I1003 10:20:11.598703 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_516af441-b673-4c5b-840f-1f9bda8cdc5a/test-operator-logs-container/0.log" Oct 03 10:20:11 crc kubenswrapper[4790]: I1003 10:20:11.895383 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-b75fr_a18ef35c-666c-4fc3-b466-7d63bb44e942/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:20:13 crc kubenswrapper[4790]: I1003 10:20:13.017456 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_67485d96-5ca0-4195-bb7a-d09d03d3dc49/watcher-applier/0.log" Oct 03 10:20:13 crc kubenswrapper[4790]: I1003 10:20:13.322528 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_e0b0e105-a5dc-4a34-b966-14fba0fd89ec/watcher-api-log/0.log" Oct 03 10:20:17 crc kubenswrapper[4790]: I1003 10:20:17.823480 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_5d8afb4c-874f-4d5d-ad9b-e11a24bd9799/watcher-decision-engine/0.log" Oct 03 10:20:18 crc kubenswrapper[4790]: I1003 10:20:18.881483 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_e0b0e105-a5dc-4a34-b966-14fba0fd89ec/watcher-api/0.log" Oct 03 10:20:23 crc kubenswrapper[4790]: I1003 10:20:23.235154 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3b885fd9-8ed3-45b9-8e6a-63069940ae79/memcached/0.log" Oct 03 10:20:57 crc kubenswrapper[4790]: I1003 10:20:57.722079 4790 generic.go:334] "Generic (PLEG): container finished" podID="2a34afb2-c892-4447-81a6-db7880c35820" containerID="3d5e61aff8681fa2b49f167ac095a0e9d8a36057541842e94d3f2a2911d20e84" exitCode=0 Oct 03 10:20:57 crc kubenswrapper[4790]: I1003 10:20:57.722162 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" event={"ID":"2a34afb2-c892-4447-81a6-db7880c35820","Type":"ContainerDied","Data":"3d5e61aff8681fa2b49f167ac095a0e9d8a36057541842e94d3f2a2911d20e84"} Oct 03 10:20:58 crc kubenswrapper[4790]: I1003 10:20:58.875398 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" Oct 03 10:20:58 crc kubenswrapper[4790]: I1003 10:20:58.914776 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hlhcg/crc-debug-qhcjl"] Oct 03 10:20:58 crc kubenswrapper[4790]: I1003 10:20:58.925399 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hlhcg/crc-debug-qhcjl"] Oct 03 10:20:58 crc kubenswrapper[4790]: I1003 10:20:58.927339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a34afb2-c892-4447-81a6-db7880c35820-host\") pod \"2a34afb2-c892-4447-81a6-db7880c35820\" (UID: \"2a34afb2-c892-4447-81a6-db7880c35820\") " Oct 03 10:20:58 crc kubenswrapper[4790]: I1003 10:20:58.927500 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2a34afb2-c892-4447-81a6-db7880c35820-host" (OuterVolumeSpecName: "host") pod "2a34afb2-c892-4447-81a6-db7880c35820" (UID: "2a34afb2-c892-4447-81a6-db7880c35820"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:20:58 crc kubenswrapper[4790]: I1003 10:20:58.927643 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv69k\" (UniqueName: \"kubernetes.io/projected/2a34afb2-c892-4447-81a6-db7880c35820-kube-api-access-bv69k\") pod \"2a34afb2-c892-4447-81a6-db7880c35820\" (UID: \"2a34afb2-c892-4447-81a6-db7880c35820\") " Oct 03 10:20:58 crc kubenswrapper[4790]: I1003 10:20:58.928203 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2a34afb2-c892-4447-81a6-db7880c35820-host\") on node \"crc\" DevicePath \"\"" Oct 03 10:20:58 crc kubenswrapper[4790]: I1003 10:20:58.934156 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a34afb2-c892-4447-81a6-db7880c35820-kube-api-access-bv69k" (OuterVolumeSpecName: "kube-api-access-bv69k") pod "2a34afb2-c892-4447-81a6-db7880c35820" (UID: "2a34afb2-c892-4447-81a6-db7880c35820"). InnerVolumeSpecName "kube-api-access-bv69k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:20:59 crc kubenswrapper[4790]: I1003 10:20:59.029875 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv69k\" (UniqueName: \"kubernetes.io/projected/2a34afb2-c892-4447-81a6-db7880c35820-kube-api-access-bv69k\") on node \"crc\" DevicePath \"\"" Oct 03 10:20:59 crc kubenswrapper[4790]: I1003 10:20:59.744694 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bcd5b739272c172c1b12ad9323c47900f42f420f572966b73428b2ca8ecb0d7" Oct 03 10:20:59 crc kubenswrapper[4790]: I1003 10:20:59.744770 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-qhcjl" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.080830 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hlhcg/crc-debug-wgqt9"] Oct 03 10:21:00 crc kubenswrapper[4790]: E1003 10:21:00.081272 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerName="extract-utilities" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.081285 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerName="extract-utilities" Oct 03 10:21:00 crc kubenswrapper[4790]: E1003 10:21:00.081322 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerName="registry-server" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.081330 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerName="registry-server" Oct 03 10:21:00 crc kubenswrapper[4790]: E1003 10:21:00.081345 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a34afb2-c892-4447-81a6-db7880c35820" containerName="container-00" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.081351 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a34afb2-c892-4447-81a6-db7880c35820" containerName="container-00" Oct 03 10:21:00 crc kubenswrapper[4790]: E1003 10:21:00.081366 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerName="extract-content" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.081372 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerName="extract-content" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.081627 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a34afb2-c892-4447-81a6-db7880c35820" 
containerName="container-00" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.081644 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="222f02b3-fb45-4b55-b0a4-bcb6032786c0" containerName="registry-server" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.082416 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.084777 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hlhcg"/"default-dockercfg-h5ccg" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.165244 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c76067b0-48f1-4481-a6ce-47f9cb784726-host\") pod \"crc-debug-wgqt9\" (UID: \"c76067b0-48f1-4481-a6ce-47f9cb784726\") " pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.165309 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcxj8\" (UniqueName: \"kubernetes.io/projected/c76067b0-48f1-4481-a6ce-47f9cb784726-kube-api-access-wcxj8\") pod \"crc-debug-wgqt9\" (UID: \"c76067b0-48f1-4481-a6ce-47f9cb784726\") " pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.267682 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c76067b0-48f1-4481-a6ce-47f9cb784726-host\") pod \"crc-debug-wgqt9\" (UID: \"c76067b0-48f1-4481-a6ce-47f9cb784726\") " pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.267742 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcxj8\" (UniqueName: 
\"kubernetes.io/projected/c76067b0-48f1-4481-a6ce-47f9cb784726-kube-api-access-wcxj8\") pod \"crc-debug-wgqt9\" (UID: \"c76067b0-48f1-4481-a6ce-47f9cb784726\") " pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.268337 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c76067b0-48f1-4481-a6ce-47f9cb784726-host\") pod \"crc-debug-wgqt9\" (UID: \"c76067b0-48f1-4481-a6ce-47f9cb784726\") " pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.284543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcxj8\" (UniqueName: \"kubernetes.io/projected/c76067b0-48f1-4481-a6ce-47f9cb784726-kube-api-access-wcxj8\") pod \"crc-debug-wgqt9\" (UID: \"c76067b0-48f1-4481-a6ce-47f9cb784726\") " pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.341251 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a34afb2-c892-4447-81a6-db7880c35820" path="/var/lib/kubelet/pods/2a34afb2-c892-4447-81a6-db7880c35820/volumes" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.402266 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.755593 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" event={"ID":"c76067b0-48f1-4481-a6ce-47f9cb784726","Type":"ContainerStarted","Data":"5b6306d88588c51309cece09d73532919ee27329505bcc43150533907c51556f"} Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.755947 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" event={"ID":"c76067b0-48f1-4481-a6ce-47f9cb784726","Type":"ContainerStarted","Data":"ab978845077d5222371761474e66cc325a3ea6053b22e167631d92faf42b3677"} Oct 03 10:21:00 crc kubenswrapper[4790]: I1003 10:21:00.782293 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" podStartSLOduration=0.782267795 podStartE2EDuration="782.267795ms" podCreationTimestamp="2025-10-03 10:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:21:00.769867554 +0000 UTC m=+6047.122801792" watchObservedRunningTime="2025-10-03 10:21:00.782267795 +0000 UTC m=+6047.135202043" Oct 03 10:21:01 crc kubenswrapper[4790]: I1003 10:21:01.771208 4790 generic.go:334] "Generic (PLEG): container finished" podID="c76067b0-48f1-4481-a6ce-47f9cb784726" containerID="5b6306d88588c51309cece09d73532919ee27329505bcc43150533907c51556f" exitCode=0 Oct 03 10:21:01 crc kubenswrapper[4790]: I1003 10:21:01.771268 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" event={"ID":"c76067b0-48f1-4481-a6ce-47f9cb784726","Type":"ContainerDied","Data":"5b6306d88588c51309cece09d73532919ee27329505bcc43150533907c51556f"} Oct 03 10:21:02 crc kubenswrapper[4790]: I1003 10:21:02.905132 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" Oct 03 10:21:03 crc kubenswrapper[4790]: I1003 10:21:03.018853 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c76067b0-48f1-4481-a6ce-47f9cb784726-host\") pod \"c76067b0-48f1-4481-a6ce-47f9cb784726\" (UID: \"c76067b0-48f1-4481-a6ce-47f9cb784726\") " Oct 03 10:21:03 crc kubenswrapper[4790]: I1003 10:21:03.018951 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c76067b0-48f1-4481-a6ce-47f9cb784726-host" (OuterVolumeSpecName: "host") pod "c76067b0-48f1-4481-a6ce-47f9cb784726" (UID: "c76067b0-48f1-4481-a6ce-47f9cb784726"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:21:03 crc kubenswrapper[4790]: I1003 10:21:03.019439 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcxj8\" (UniqueName: \"kubernetes.io/projected/c76067b0-48f1-4481-a6ce-47f9cb784726-kube-api-access-wcxj8\") pod \"c76067b0-48f1-4481-a6ce-47f9cb784726\" (UID: \"c76067b0-48f1-4481-a6ce-47f9cb784726\") " Oct 03 10:21:03 crc kubenswrapper[4790]: I1003 10:21:03.019918 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c76067b0-48f1-4481-a6ce-47f9cb784726-host\") on node \"crc\" DevicePath \"\"" Oct 03 10:21:03 crc kubenswrapper[4790]: I1003 10:21:03.024722 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c76067b0-48f1-4481-a6ce-47f9cb784726-kube-api-access-wcxj8" (OuterVolumeSpecName: "kube-api-access-wcxj8") pod "c76067b0-48f1-4481-a6ce-47f9cb784726" (UID: "c76067b0-48f1-4481-a6ce-47f9cb784726"). InnerVolumeSpecName "kube-api-access-wcxj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:21:03 crc kubenswrapper[4790]: I1003 10:21:03.122257 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcxj8\" (UniqueName: \"kubernetes.io/projected/c76067b0-48f1-4481-a6ce-47f9cb784726-kube-api-access-wcxj8\") on node \"crc\" DevicePath \"\"" Oct 03 10:21:03 crc kubenswrapper[4790]: I1003 10:21:03.796997 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" event={"ID":"c76067b0-48f1-4481-a6ce-47f9cb784726","Type":"ContainerDied","Data":"ab978845077d5222371761474e66cc325a3ea6053b22e167631d92faf42b3677"} Oct 03 10:21:03 crc kubenswrapper[4790]: I1003 10:21:03.797364 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab978845077d5222371761474e66cc325a3ea6053b22e167631d92faf42b3677" Oct 03 10:21:03 crc kubenswrapper[4790]: I1003 10:21:03.797091 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-wgqt9" Oct 03 10:21:10 crc kubenswrapper[4790]: I1003 10:21:10.301803 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hlhcg/crc-debug-wgqt9"] Oct 03 10:21:10 crc kubenswrapper[4790]: I1003 10:21:10.309958 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hlhcg/crc-debug-wgqt9"] Oct 03 10:21:10 crc kubenswrapper[4790]: I1003 10:21:10.343011 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c76067b0-48f1-4481-a6ce-47f9cb784726" path="/var/lib/kubelet/pods/c76067b0-48f1-4481-a6ce-47f9cb784726/volumes" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.464989 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hlhcg/crc-debug-nr6f7"] Oct 03 10:21:11 crc kubenswrapper[4790]: E1003 10:21:11.465768 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76067b0-48f1-4481-a6ce-47f9cb784726" 
containerName="container-00" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.465781 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76067b0-48f1-4481-a6ce-47f9cb784726" containerName="container-00" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.466041 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c76067b0-48f1-4481-a6ce-47f9cb784726" containerName="container-00" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.466907 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.469044 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hlhcg"/"default-dockercfg-h5ccg" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.571067 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmbzd\" (UniqueName: \"kubernetes.io/projected/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-kube-api-access-pmbzd\") pod \"crc-debug-nr6f7\" (UID: \"c364ee9d-d51a-4bf7-b15e-ee32546ce34c\") " pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.571235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-host\") pod \"crc-debug-nr6f7\" (UID: \"c364ee9d-d51a-4bf7-b15e-ee32546ce34c\") " pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.672987 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmbzd\" (UniqueName: \"kubernetes.io/projected/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-kube-api-access-pmbzd\") pod \"crc-debug-nr6f7\" (UID: \"c364ee9d-d51a-4bf7-b15e-ee32546ce34c\") " pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" Oct 03 10:21:11 
crc kubenswrapper[4790]: I1003 10:21:11.673129 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-host\") pod \"crc-debug-nr6f7\" (UID: \"c364ee9d-d51a-4bf7-b15e-ee32546ce34c\") " pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.673273 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-host\") pod \"crc-debug-nr6f7\" (UID: \"c364ee9d-d51a-4bf7-b15e-ee32546ce34c\") " pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.704125 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmbzd\" (UniqueName: \"kubernetes.io/projected/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-kube-api-access-pmbzd\") pod \"crc-debug-nr6f7\" (UID: \"c364ee9d-d51a-4bf7-b15e-ee32546ce34c\") " pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.786431 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" Oct 03 10:21:11 crc kubenswrapper[4790]: I1003 10:21:11.871730 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" event={"ID":"c364ee9d-d51a-4bf7-b15e-ee32546ce34c","Type":"ContainerStarted","Data":"aaac422ace97198ce2a727d91687e6960f045521e33bd8457b0ac49f69291400"} Oct 03 10:21:12 crc kubenswrapper[4790]: I1003 10:21:12.883922 4790 generic.go:334] "Generic (PLEG): container finished" podID="c364ee9d-d51a-4bf7-b15e-ee32546ce34c" containerID="c60ea5d13567a4b8e07d7cc4a6159fcb805752ce1baf3d6b26109e0c8e4920af" exitCode=0 Oct 03 10:21:12 crc kubenswrapper[4790]: I1003 10:21:12.883999 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" event={"ID":"c364ee9d-d51a-4bf7-b15e-ee32546ce34c","Type":"ContainerDied","Data":"c60ea5d13567a4b8e07d7cc4a6159fcb805752ce1baf3d6b26109e0c8e4920af"} Oct 03 10:21:12 crc kubenswrapper[4790]: I1003 10:21:12.922827 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hlhcg/crc-debug-nr6f7"] Oct 03 10:21:12 crc kubenswrapper[4790]: I1003 10:21:12.934233 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hlhcg/crc-debug-nr6f7"] Oct 03 10:21:13 crc kubenswrapper[4790]: I1003 10:21:13.993673 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.019028 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-host\") pod \"c364ee9d-d51a-4bf7-b15e-ee32546ce34c\" (UID: \"c364ee9d-d51a-4bf7-b15e-ee32546ce34c\") " Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.019119 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmbzd\" (UniqueName: \"kubernetes.io/projected/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-kube-api-access-pmbzd\") pod \"c364ee9d-d51a-4bf7-b15e-ee32546ce34c\" (UID: \"c364ee9d-d51a-4bf7-b15e-ee32546ce34c\") " Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.019288 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-host" (OuterVolumeSpecName: "host") pod "c364ee9d-d51a-4bf7-b15e-ee32546ce34c" (UID: "c364ee9d-d51a-4bf7-b15e-ee32546ce34c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.019732 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-host\") on node \"crc\" DevicePath \"\"" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.028820 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-kube-api-access-pmbzd" (OuterVolumeSpecName: "kube-api-access-pmbzd") pod "c364ee9d-d51a-4bf7-b15e-ee32546ce34c" (UID: "c364ee9d-d51a-4bf7-b15e-ee32546ce34c"). InnerVolumeSpecName "kube-api-access-pmbzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.121558 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmbzd\" (UniqueName: \"kubernetes.io/projected/c364ee9d-d51a-4bf7-b15e-ee32546ce34c-kube-api-access-pmbzd\") on node \"crc\" DevicePath \"\"" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.340675 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c364ee9d-d51a-4bf7-b15e-ee32546ce34c" path="/var/lib/kubelet/pods/c364ee9d-d51a-4bf7-b15e-ee32546ce34c/volumes" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.469208 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/util/0.log" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.669310 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/util/0.log" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.725958 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/pull/0.log" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.726082 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/pull/0.log" Oct 03 10:21:14 crc kubenswrapper[4790]: E1003 10:21:14.859341 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice/crio-aaac422ace97198ce2a727d91687e6960f045521e33bd8457b0ac49f69291400\": RecentStats: unable to find data in memory cache]" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.866933 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/util/0.log" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.874205 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/pull/0.log" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.904380 4790 scope.go:117] "RemoveContainer" containerID="c60ea5d13567a4b8e07d7cc4a6159fcb805752ce1baf3d6b26109e0c8e4920af" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.904416 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/crc-debug-nr6f7" Oct 03 10:21:14 crc kubenswrapper[4790]: I1003 10:21:14.940524 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/extract/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.083759 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-rtftw_f598237f-5311-44e2-8ed2-3de81bbebc91/kube-rbac-proxy/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.170737 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-rtftw_f598237f-5311-44e2-8ed2-3de81bbebc91/manager/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.192948 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-q7g8m_9b9ab2ca-8c9e-46d1-af78-b811e32b05f0/kube-rbac-proxy/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.305006 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-q7g8m_9b9ab2ca-8c9e-46d1-af78-b811e32b05f0/manager/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.392599 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-s25k5_b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf/kube-rbac-proxy/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.395020 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-s25k5_b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf/manager/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.529324 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-pqh6z_a168c4e9-522f-4f9b-97d3-5c50008ec59c/kube-rbac-proxy/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.642931 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-pqh6z_a168c4e9-522f-4f9b-97d3-5c50008ec59c/manager/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.688378 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-h2km6_d6653934-c835-4642-a1da-9159b7f85dcb/kube-rbac-proxy/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.742877 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-h2km6_d6653934-c835-4642-a1da-9159b7f85dcb/manager/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.808931 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-d6t4r_5b3d2a0d-d36d-4a67-9c5f-6a85735127a1/kube-rbac-proxy/0.log" Oct 03 10:21:15 crc kubenswrapper[4790]: I1003 10:21:15.896539 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-d6t4r_5b3d2a0d-d36d-4a67-9c5f-6a85735127a1/manager/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.008386 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-894gw_2bc6d6a2-91a5-464c-bba7-3a00ee0b1938/kube-rbac-proxy/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.154417 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-2vq5g_1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4/kube-rbac-proxy/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 
10:21:16.207397 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-894gw_2bc6d6a2-91a5-464c-bba7-3a00ee0b1938/manager/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.244149 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-2vq5g_1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4/manager/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.353294 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-twvmq_12237b3a-fae2-4cac-b0d7-ee396be3da08/kube-rbac-proxy/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.460329 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-twvmq_12237b3a-fae2-4cac-b0d7-ee396be3da08/manager/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.570875 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-k8l6q_9cfa9536-48e0-4639-a0ed-acab9ab92ed0/kube-rbac-proxy/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.572241 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-k8l6q_9cfa9536-48e0-4639-a0ed-acab9ab92ed0/manager/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.676713 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-sdqf6_c97f26b2-5313-49db-8ee8-bcde2e793bfa/kube-rbac-proxy/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.773111 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-sdqf6_c97f26b2-5313-49db-8ee8-bcde2e793bfa/manager/0.log" Oct 03 
10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.817363 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-668k4_e6f6790b-7aca-49c0-881f-8a7d15ac4071/kube-rbac-proxy/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.885461 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-668k4_e6f6790b-7aca-49c0-881f-8a7d15ac4071/manager/0.log" Oct 03 10:21:16 crc kubenswrapper[4790]: I1003 10:21:16.955864 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-tlmwh_b52c73ba-c737-4eaf-a913-155d05c96f6e/kube-rbac-proxy/0.log" Oct 03 10:21:17 crc kubenswrapper[4790]: I1003 10:21:17.116252 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-tlmwh_b52c73ba-c737-4eaf-a913-155d05c96f6e/manager/0.log" Oct 03 10:21:17 crc kubenswrapper[4790]: I1003 10:21:17.165955 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-62zvh_98c1e024-12c4-47f7-ac46-ae9f6bd45ab9/kube-rbac-proxy/0.log" Oct 03 10:21:17 crc kubenswrapper[4790]: I1003 10:21:17.206015 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-62zvh_98c1e024-12c4-47f7-ac46-ae9f6bd45ab9/manager/0.log" Oct 03 10:21:17 crc kubenswrapper[4790]: I1003 10:21:17.326677 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5_759c9730-8d00-4516-9ed2-79f56be527bf/kube-rbac-proxy/0.log" Oct 03 10:21:17 crc kubenswrapper[4790]: I1003 10:21:17.368013 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5_759c9730-8d00-4516-9ed2-79f56be527bf/manager/0.log" Oct 03 10:21:17 crc kubenswrapper[4790]: I1003 10:21:17.483393 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fb94b767d-nhfws_c64c5a59-ac27-4428-8cdc-a1f8c96368bb/kube-rbac-proxy/0.log" Oct 03 10:21:17 crc kubenswrapper[4790]: I1003 10:21:17.718148 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7f8d586cd6-ptxlm_f1035867-3856-4436-8175-e21df1fbb18c/kube-rbac-proxy/0.log" Oct 03 10:21:17 crc kubenswrapper[4790]: I1003 10:21:17.880070 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7f8d586cd6-ptxlm_f1035867-3856-4436-8175-e21df1fbb18c/operator/0.log" Oct 03 10:21:17 crc kubenswrapper[4790]: I1003 10:21:17.941112 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-w2g5j_9a285450-ca0c-4d27-b51b-81474aaa7d8f/registry-server/0.log" Oct 03 10:21:18 crc kubenswrapper[4790]: I1003 10:21:18.167803 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-kv6xz_94dc0b91-323f-4390-8af1-71019409bdf2/kube-rbac-proxy/0.log" Oct 03 10:21:18 crc kubenswrapper[4790]: I1003 10:21:18.266860 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-kv6xz_94dc0b91-323f-4390-8af1-71019409bdf2/manager/0.log" Oct 03 10:21:18 crc kubenswrapper[4790]: I1003 10:21:18.376771 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-8gls6_c7787291-34a2-4682-88f4-7b05f28f36f2/kube-rbac-proxy/0.log" Oct 03 10:21:18 crc kubenswrapper[4790]: I1003 
10:21:18.509236 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-8gls6_c7787291-34a2-4682-88f4-7b05f28f36f2/manager/0.log" Oct 03 10:21:18 crc kubenswrapper[4790]: I1003 10:21:18.663113 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4_681ac688-7419-4d78-bf96-3a1d54c30d8b/operator/0.log" Oct 03 10:21:18 crc kubenswrapper[4790]: I1003 10:21:18.814408 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fb94b767d-nhfws_c64c5a59-ac27-4428-8cdc-a1f8c96368bb/manager/0.log" Oct 03 10:21:18 crc kubenswrapper[4790]: I1003 10:21:18.829313 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-ntlnz_7c58f90d-6708-4bb2-ba68-fb4e08b99c61/manager/0.log" Oct 03 10:21:18 crc kubenswrapper[4790]: I1003 10:21:18.852392 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-ntlnz_7c58f90d-6708-4bb2-ba68-fb4e08b99c61/kube-rbac-proxy/0.log" Oct 03 10:21:18 crc kubenswrapper[4790]: I1003 10:21:18.891601 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-wrcp9_b90a3b76-58bf-4c8a-a361-c59a183f6828/kube-rbac-proxy/0.log" Oct 03 10:21:19 crc kubenswrapper[4790]: I1003 10:21:19.085658 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mshnf_15dfe278-fcef-4c6f-8c06-cce51c7d5f7c/kube-rbac-proxy/0.log" Oct 03 10:21:19 crc kubenswrapper[4790]: I1003 10:21:19.135347 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mshnf_15dfe278-fcef-4c6f-8c06-cce51c7d5f7c/manager/0.log" Oct 03 
10:21:19 crc kubenswrapper[4790]: I1003 10:21:19.260519 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-wrcp9_b90a3b76-58bf-4c8a-a361-c59a183f6828/manager/0.log" Oct 03 10:21:19 crc kubenswrapper[4790]: I1003 10:21:19.288208 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5c899c45b6-gb6mg_a583ed8e-8a45-4dc9-80e0-bf559d110818/kube-rbac-proxy/0.log" Oct 03 10:21:19 crc kubenswrapper[4790]: I1003 10:21:19.405260 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5c899c45b6-gb6mg_a583ed8e-8a45-4dc9-80e0-bf559d110818/manager/0.log" Oct 03 10:21:25 crc kubenswrapper[4790]: E1003 10:21:25.165808 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice/crio-aaac422ace97198ce2a727d91687e6960f045521e33bd8457b0ac49f69291400\": RecentStats: unable to find data in memory cache]" Oct 03 10:21:34 crc kubenswrapper[4790]: I1003 10:21:34.307490 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dp7r9_89e2e475-d36b-4626-ab23-e767070eb376/control-plane-machine-set-operator/0.log" Oct 03 10:21:34 crc kubenswrapper[4790]: I1003 10:21:34.460029 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tml4h_2f81f816-0c58-437b-be43-b782515e8997/kube-rbac-proxy/0.log" Oct 03 10:21:34 crc kubenswrapper[4790]: I1003 10:21:34.501350 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tml4h_2f81f816-0c58-437b-be43-b782515e8997/machine-api-operator/0.log" Oct 03 10:21:35 crc kubenswrapper[4790]: E1003 10:21:35.454417 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice/crio-aaac422ace97198ce2a727d91687e6960f045521e33bd8457b0ac49f69291400\": RecentStats: unable to find data in memory cache]" Oct 03 10:21:45 crc kubenswrapper[4790]: I1003 10:21:45.634909 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-fpr25_594cfed0-b92c-4763-998c-c7aceef4c2f5/cert-manager-controller/0.log" Oct 03 10:21:45 crc kubenswrapper[4790]: E1003 10:21:45.710186 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice/crio-aaac422ace97198ce2a727d91687e6960f045521e33bd8457b0ac49f69291400\": RecentStats: unable to find data in memory cache]" Oct 03 10:21:45 crc kubenswrapper[4790]: I1003 10:21:45.786385 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-r87sx_f6eb09d0-b3d6-4bc0-9125-5389bf8124dd/cert-manager-cainjector/0.log" Oct 03 10:21:45 crc kubenswrapper[4790]: I1003 10:21:45.883741 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9k2xl_9ea495a1-9693-44fd-a9d5-6f9a710456f5/cert-manager-webhook/0.log" Oct 03 10:21:55 crc kubenswrapper[4790]: E1003 10:21:55.995962 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice/crio-aaac422ace97198ce2a727d91687e6960f045521e33bd8457b0ac49f69291400\": RecentStats: unable to find data in memory cache]" Oct 03 10:21:56 crc kubenswrapper[4790]: I1003 10:21:56.817450 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-kt62h_1594d001-e795-49a2-bd6a-3a9ec48c2e4d/nmstate-console-plugin/0.log" Oct 03 10:21:57 crc kubenswrapper[4790]: I1003 10:21:57.014599 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-rdglh_525f7376-4012-46b8-b79b-0c7bfb288354/nmstate-handler/0.log" Oct 03 10:21:57 crc kubenswrapper[4790]: I1003 10:21:57.030412 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-hrwtp_d542c730-e799-4975-9b94-ae826e4706cf/kube-rbac-proxy/0.log" Oct 03 10:21:57 crc kubenswrapper[4790]: I1003 10:21:57.075699 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-hrwtp_d542c730-e799-4975-9b94-ae826e4706cf/nmstate-metrics/0.log" Oct 03 10:21:57 crc kubenswrapper[4790]: I1003 10:21:57.219520 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-kgcvn_5fb9da29-2f74-417d-8394-8f8fa9fea3ec/nmstate-operator/0.log" Oct 03 10:21:57 crc kubenswrapper[4790]: I1003 10:21:57.296974 4790 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-l4kg8_8ad5530d-7fb8-4c5b-b0ef-df6837ebc877/nmstate-webhook/0.log" Oct 03 10:22:06 crc kubenswrapper[4790]: E1003 10:22:06.266998 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364ee9d_d51a_4bf7_b15e_ee32546ce34c.slice/crio-aaac422ace97198ce2a727d91687e6960f045521e33bd8457b0ac49f69291400\": RecentStats: unable to find data in memory cache]" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.473042 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lhr5g"] Oct 03 10:22:06 crc kubenswrapper[4790]: E1003 10:22:06.473733 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c364ee9d-d51a-4bf7-b15e-ee32546ce34c" containerName="container-00" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.473749 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c364ee9d-d51a-4bf7-b15e-ee32546ce34c" containerName="container-00" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.473951 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c364ee9d-d51a-4bf7-b15e-ee32546ce34c" containerName="container-00" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.475315 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.516653 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhr5g"] Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.586297 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdv9v\" (UniqueName: \"kubernetes.io/projected/445a5bca-64c2-4798-94eb-5e36f616043d-kube-api-access-rdv9v\") pod \"redhat-operators-lhr5g\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.586974 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-utilities\") pod \"redhat-operators-lhr5g\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.587144 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-catalog-content\") pod \"redhat-operators-lhr5g\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.688847 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdv9v\" (UniqueName: \"kubernetes.io/projected/445a5bca-64c2-4798-94eb-5e36f616043d-kube-api-access-rdv9v\") pod \"redhat-operators-lhr5g\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.688901 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-utilities\") pod \"redhat-operators-lhr5g\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.688926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-catalog-content\") pod \"redhat-operators-lhr5g\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.689372 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-catalog-content\") pod \"redhat-operators-lhr5g\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.689800 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-utilities\") pod \"redhat-operators-lhr5g\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.714729 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdv9v\" (UniqueName: \"kubernetes.io/projected/445a5bca-64c2-4798-94eb-5e36f616043d-kube-api-access-rdv9v\") pod \"redhat-operators-lhr5g\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:06 crc kubenswrapper[4790]: I1003 10:22:06.804856 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:07 crc kubenswrapper[4790]: I1003 10:22:07.300640 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhr5g"] Oct 03 10:22:07 crc kubenswrapper[4790]: I1003 10:22:07.420458 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhr5g" event={"ID":"445a5bca-64c2-4798-94eb-5e36f616043d","Type":"ContainerStarted","Data":"eab7c92840146e1314d2ac5c53bab8cf8c7b632810d7c3f032d263f1bc40f6ef"} Oct 03 10:22:08 crc kubenswrapper[4790]: I1003 10:22:08.432211 4790 generic.go:334] "Generic (PLEG): container finished" podID="445a5bca-64c2-4798-94eb-5e36f616043d" containerID="50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e" exitCode=0 Oct 03 10:22:08 crc kubenswrapper[4790]: I1003 10:22:08.432322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhr5g" event={"ID":"445a5bca-64c2-4798-94eb-5e36f616043d","Type":"ContainerDied","Data":"50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e"} Oct 03 10:22:08 crc kubenswrapper[4790]: I1003 10:22:08.434666 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:22:09 crc kubenswrapper[4790]: I1003 10:22:09.442942 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhr5g" event={"ID":"445a5bca-64c2-4798-94eb-5e36f616043d","Type":"ContainerStarted","Data":"628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006"} Oct 03 10:22:10 crc kubenswrapper[4790]: I1003 10:22:10.186677 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:22:10 crc 
kubenswrapper[4790]: I1003 10:22:10.187027 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:22:10 crc kubenswrapper[4790]: I1003 10:22:10.732939 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-lbn9j_32b53a38-4a5c-44b1-9518-fe99746bf6a4/kube-rbac-proxy/0.log" Oct 03 10:22:10 crc kubenswrapper[4790]: I1003 10:22:10.858414 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-lbn9j_32b53a38-4a5c-44b1-9518-fe99746bf6a4/controller/0.log" Oct 03 10:22:10 crc kubenswrapper[4790]: I1003 10:22:10.901376 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-frr-files/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.113099 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-frr-files/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.122503 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-reloader/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.163000 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-reloader/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.179439 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-metrics/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.402463 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-metrics/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.402946 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-frr-files/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.427132 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-reloader/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.437530 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-metrics/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.575466 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-frr-files/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.583449 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-reloader/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.613057 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-metrics/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.675600 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/controller/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.740556 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/frr-metrics/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.788257 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/kube-rbac-proxy/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.892838 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/kube-rbac-proxy-frr/0.log" Oct 03 10:22:11 crc kubenswrapper[4790]: I1003 10:22:11.987305 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/reloader/0.log" Oct 03 10:22:12 crc kubenswrapper[4790]: I1003 10:22:12.118051 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-xtg2d_966af9a3-d44e-4bf2-bff8-9b8573083daa/frr-k8s-webhook-server/0.log" Oct 03 10:22:12 crc kubenswrapper[4790]: I1003 10:22:12.308665 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c97c79c55-pqbls_1ae3495c-a801-4526-87ed-b8e81ea71d53/manager/0.log" Oct 03 10:22:12 crc kubenswrapper[4790]: I1003 10:22:12.400650 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-687d9f4499-mr2f5_f5d65866-e446-4512-b8aa-566a283ff5be/webhook-server/0.log" Oct 03 10:22:12 crc kubenswrapper[4790]: I1003 10:22:12.638164 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lxwhp_8f597a65-e358-498d-b635-6eafe7618337/kube-rbac-proxy/0.log" Oct 03 10:22:13 crc kubenswrapper[4790]: I1003 10:22:13.493285 4790 generic.go:334] "Generic (PLEG): container finished" podID="445a5bca-64c2-4798-94eb-5e36f616043d" containerID="628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006" exitCode=0 Oct 03 10:22:13 crc kubenswrapper[4790]: I1003 10:22:13.493374 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhr5g" 
event={"ID":"445a5bca-64c2-4798-94eb-5e36f616043d","Type":"ContainerDied","Data":"628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006"} Oct 03 10:22:14 crc kubenswrapper[4790]: I1003 10:22:14.038000 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lxwhp_8f597a65-e358-498d-b635-6eafe7618337/speaker/0.log" Oct 03 10:22:14 crc kubenswrapper[4790]: I1003 10:22:14.070010 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/frr/0.log" Oct 03 10:22:14 crc kubenswrapper[4790]: E1003 10:22:14.367922 4790 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/48330cda915c605bfeadb8b91f8fe2db5c3b4072879876f2078c09a6303b7e93/diff" to get inode usage: stat /var/lib/containers/storage/overlay/48330cda915c605bfeadb8b91f8fe2db5c3b4072879876f2078c09a6303b7e93/diff: no such file or directory, extraDiskErr: Oct 03 10:22:14 crc kubenswrapper[4790]: I1003 10:22:14.508255 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhr5g" event={"ID":"445a5bca-64c2-4798-94eb-5e36f616043d","Type":"ContainerStarted","Data":"91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e"} Oct 03 10:22:14 crc kubenswrapper[4790]: I1003 10:22:14.533591 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lhr5g" podStartSLOduration=3.031011386 podStartE2EDuration="8.533554565s" podCreationTimestamp="2025-10-03 10:22:06 +0000 UTC" firstStartedPulling="2025-10-03 10:22:08.434437013 +0000 UTC m=+6114.787371251" lastFinishedPulling="2025-10-03 10:22:13.936980192 +0000 UTC m=+6120.289914430" observedRunningTime="2025-10-03 10:22:14.530128841 +0000 UTC m=+6120.883063099" watchObservedRunningTime="2025-10-03 10:22:14.533554565 +0000 UTC m=+6120.886488803" Oct 03 10:22:16 crc kubenswrapper[4790]: I1003 
10:22:16.804974 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:16 crc kubenswrapper[4790]: I1003 10:22:16.805599 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:17 crc kubenswrapper[4790]: I1003 10:22:17.869682 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lhr5g" podUID="445a5bca-64c2-4798-94eb-5e36f616043d" containerName="registry-server" probeResult="failure" output=< Oct 03 10:22:17 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 10:22:17 crc kubenswrapper[4790]: > Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.541864 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zb589"] Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.544186 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.554648 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zb589"] Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.652850 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff5k7\" (UniqueName: \"kubernetes.io/projected/a92e8ea7-1551-436b-89d7-0409296a4c41-kube-api-access-ff5k7\") pod \"community-operators-zb589\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.653209 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-utilities\") pod \"community-operators-zb589\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.653230 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-catalog-content\") pod \"community-operators-zb589\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.754934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-utilities\") pod \"community-operators-zb589\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.754980 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-catalog-content\") pod \"community-operators-zb589\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.755140 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff5k7\" (UniqueName: \"kubernetes.io/projected/a92e8ea7-1551-436b-89d7-0409296a4c41-kube-api-access-ff5k7\") pod \"community-operators-zb589\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.755838 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-utilities\") pod \"community-operators-zb589\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.756065 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-catalog-content\") pod \"community-operators-zb589\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.775427 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff5k7\" (UniqueName: \"kubernetes.io/projected/a92e8ea7-1551-436b-89d7-0409296a4c41-kube-api-access-ff5k7\") pod \"community-operators-zb589\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:18 crc kubenswrapper[4790]: I1003 10:22:18.894475 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:19 crc kubenswrapper[4790]: I1003 10:22:19.272073 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zb589"] Oct 03 10:22:19 crc kubenswrapper[4790]: I1003 10:22:19.553726 4790 generic.go:334] "Generic (PLEG): container finished" podID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerID="0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de" exitCode=0 Oct 03 10:22:19 crc kubenswrapper[4790]: I1003 10:22:19.553896 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb589" event={"ID":"a92e8ea7-1551-436b-89d7-0409296a4c41","Type":"ContainerDied","Data":"0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de"} Oct 03 10:22:19 crc kubenswrapper[4790]: I1003 10:22:19.554016 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb589" event={"ID":"a92e8ea7-1551-436b-89d7-0409296a4c41","Type":"ContainerStarted","Data":"f3d9b3c1c176872ecb3b9f2cf1d2eef73b66f895ca7b2b35eabf0b135542ccc0"} Oct 03 10:22:20 crc kubenswrapper[4790]: I1003 10:22:20.565924 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb589" event={"ID":"a92e8ea7-1551-436b-89d7-0409296a4c41","Type":"ContainerStarted","Data":"daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89"} Oct 03 10:22:21 crc kubenswrapper[4790]: I1003 10:22:21.576474 4790 generic.go:334] "Generic (PLEG): container finished" podID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerID="daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89" exitCode=0 Oct 03 10:22:21 crc kubenswrapper[4790]: I1003 10:22:21.576526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb589" 
event={"ID":"a92e8ea7-1551-436b-89d7-0409296a4c41","Type":"ContainerDied","Data":"daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89"} Oct 03 10:22:22 crc kubenswrapper[4790]: I1003 10:22:22.588637 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb589" event={"ID":"a92e8ea7-1551-436b-89d7-0409296a4c41","Type":"ContainerStarted","Data":"58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027"} Oct 03 10:22:22 crc kubenswrapper[4790]: I1003 10:22:22.607747 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zb589" podStartSLOduration=2.11544101 podStartE2EDuration="4.607728772s" podCreationTimestamp="2025-10-03 10:22:18 +0000 UTC" firstStartedPulling="2025-10-03 10:22:19.555565929 +0000 UTC m=+6125.908500167" lastFinishedPulling="2025-10-03 10:22:22.047853681 +0000 UTC m=+6128.400787929" observedRunningTime="2025-10-03 10:22:22.604097693 +0000 UTC m=+6128.957031941" watchObservedRunningTime="2025-10-03 10:22:22.607728772 +0000 UTC m=+6128.960663010" Oct 03 10:22:25 crc kubenswrapper[4790]: I1003 10:22:25.851641 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/util/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.050997 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/util/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.084520 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/pull/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.120504 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/pull/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.295298 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/util/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.295692 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/extract/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.299555 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/pull/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.473795 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/util/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.663425 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/pull/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.672067 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/pull/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.708992 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/util/0.log" Oct 03 
10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.858523 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/pull/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.874175 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.890461 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/extract/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.891264 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/util/0.log" Oct 03 10:22:26 crc kubenswrapper[4790]: I1003 10:22:26.922920 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:27 crc kubenswrapper[4790]: I1003 10:22:27.044258 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-utilities/0.log" Oct 03 10:22:27 crc kubenswrapper[4790]: I1003 10:22:27.210492 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-content/0.log" Oct 03 10:22:27 crc kubenswrapper[4790]: I1003 10:22:27.212621 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-utilities/0.log" Oct 03 10:22:27 crc kubenswrapper[4790]: I1003 10:22:27.240753 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-content/0.log" Oct 03 10:22:27 crc kubenswrapper[4790]: I1003 10:22:27.453882 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-utilities/0.log" Oct 03 10:22:27 crc kubenswrapper[4790]: I1003 10:22:27.504253 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-content/0.log" Oct 03 10:22:27 crc kubenswrapper[4790]: I1003 10:22:27.703599 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-utilities/0.log" Oct 03 10:22:27 crc kubenswrapper[4790]: I1003 10:22:27.855544 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-content/0.log" Oct 03 10:22:27 crc kubenswrapper[4790]: I1003 10:22:27.906769 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-content/0.log" Oct 03 10:22:27 crc kubenswrapper[4790]: I1003 10:22:27.909464 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-utilities/0.log" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.192293 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-content/0.log" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.277824 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/registry-server/0.log" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.290202 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-utilities/0.log" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.489193 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zb589_a92e8ea7-1551-436b-89d7-0409296a4c41/extract-utilities/0.log" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.702533 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zb589_a92e8ea7-1551-436b-89d7-0409296a4c41/extract-utilities/0.log" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.744728 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zb589_a92e8ea7-1551-436b-89d7-0409296a4c41/extract-content/0.log" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.747901 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zb589_a92e8ea7-1551-436b-89d7-0409296a4c41/extract-content/0.log" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.875485 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/registry-server/0.log" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.895969 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.896018 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.950388 4790 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zb589_a92e8ea7-1551-436b-89d7-0409296a4c41/extract-utilities/0.log" Oct 03 10:22:28 crc kubenswrapper[4790]: I1003 10:22:28.952738 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.000087 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zb589_a92e8ea7-1551-436b-89d7-0409296a4c41/extract-content/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.016147 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zb589_a92e8ea7-1551-436b-89d7-0409296a4c41/registry-server/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.142054 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/util/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.340654 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/pull/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.361708 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/util/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.420240 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/pull/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.555207 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/extract/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.607266 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/pull/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.616850 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/util/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.718019 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.761566 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-5sbms_dc166124-1b08-4c1e-90fe-a35d6331fcf4/marketplace-operator/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.853763 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-utilities/0.log" Oct 03 10:22:29 crc kubenswrapper[4790]: I1003 10:22:29.991812 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-content/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.004995 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-utilities/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.004995 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-content/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.186866 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-utilities/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.204504 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-content/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.313131 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-utilities/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.412280 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/registry-server/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.503167 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-utilities/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.506883 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-content/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.512976 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-content/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.735848 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhr5g"] Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.736155 
4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lhr5g" podUID="445a5bca-64c2-4798-94eb-5e36f616043d" containerName="registry-server" containerID="cri-o://91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e" gracePeriod=2 Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.758896 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-content/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.824724 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhr5g_445a5bca-64c2-4798-94eb-5e36f616043d/extract-utilities/0.log" Oct 03 10:22:30 crc kubenswrapper[4790]: I1003 10:22:30.832370 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-utilities/0.log" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.094884 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhr5g_445a5bca-64c2-4798-94eb-5e36f616043d/extract-content/0.log" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.094886 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhr5g_445a5bca-64c2-4798-94eb-5e36f616043d/extract-utilities/0.log" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.248130 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lhr5g_445a5bca-64c2-4798-94eb-5e36f616043d/extract-content/0.log" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.252171 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.264505 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-utilities\") pod \"445a5bca-64c2-4798-94eb-5e36f616043d\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.264723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-catalog-content\") pod \"445a5bca-64c2-4798-94eb-5e36f616043d\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.264773 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdv9v\" (UniqueName: \"kubernetes.io/projected/445a5bca-64c2-4798-94eb-5e36f616043d-kube-api-access-rdv9v\") pod \"445a5bca-64c2-4798-94eb-5e36f616043d\" (UID: \"445a5bca-64c2-4798-94eb-5e36f616043d\") " Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.265467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-utilities" (OuterVolumeSpecName: "utilities") pod "445a5bca-64c2-4798-94eb-5e36f616043d" (UID: "445a5bca-64c2-4798-94eb-5e36f616043d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.278523 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/445a5bca-64c2-4798-94eb-5e36f616043d-kube-api-access-rdv9v" (OuterVolumeSpecName: "kube-api-access-rdv9v") pod "445a5bca-64c2-4798-94eb-5e36f616043d" (UID: "445a5bca-64c2-4798-94eb-5e36f616043d"). InnerVolumeSpecName "kube-api-access-rdv9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.366839 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdv9v\" (UniqueName: \"kubernetes.io/projected/445a5bca-64c2-4798-94eb-5e36f616043d-kube-api-access-rdv9v\") on node \"crc\" DevicePath \"\"" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.366873 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.372041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "445a5bca-64c2-4798-94eb-5e36f616043d" (UID: "445a5bca-64c2-4798-94eb-5e36f616043d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.469006 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/445a5bca-64c2-4798-94eb-5e36f616043d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.683209 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/registry-server/0.log" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.685784 4790 generic.go:334] "Generic (PLEG): container finished" podID="445a5bca-64c2-4798-94eb-5e36f616043d" containerID="91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e" exitCode=0 Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.685868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhr5g" 
event={"ID":"445a5bca-64c2-4798-94eb-5e36f616043d","Type":"ContainerDied","Data":"91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e"} Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.685926 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhr5g" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.685947 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhr5g" event={"ID":"445a5bca-64c2-4798-94eb-5e36f616043d","Type":"ContainerDied","Data":"eab7c92840146e1314d2ac5c53bab8cf8c7b632810d7c3f032d263f1bc40f6ef"} Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.685970 4790 scope.go:117] "RemoveContainer" containerID="91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.710316 4790 scope.go:117] "RemoveContainer" containerID="628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.739611 4790 scope.go:117] "RemoveContainer" containerID="50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.742317 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhr5g"] Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.777295 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lhr5g"] Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.789961 4790 scope.go:117] "RemoveContainer" containerID="91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e" Oct 03 10:22:31 crc kubenswrapper[4790]: E1003 10:22:31.790413 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e\": container with ID 
starting with 91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e not found: ID does not exist" containerID="91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.790464 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e"} err="failed to get container status \"91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e\": rpc error: code = NotFound desc = could not find container \"91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e\": container with ID starting with 91ef5b1c2a69e61e91f7b3b31b4167307c4e07763c2af849b19d8ce33b51f52e not found: ID does not exist" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.790491 4790 scope.go:117] "RemoveContainer" containerID="628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006" Oct 03 10:22:31 crc kubenswrapper[4790]: E1003 10:22:31.790754 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006\": container with ID starting with 628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006 not found: ID does not exist" containerID="628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.790777 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006"} err="failed to get container status \"628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006\": rpc error: code = NotFound desc = could not find container \"628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006\": container with ID starting with 628632b24b8438ffd7104ed02f60c9b6edb04a59cc97b9153a815700d45e9006 not found: 
ID does not exist" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.790790 4790 scope.go:117] "RemoveContainer" containerID="50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e" Oct 03 10:22:31 crc kubenswrapper[4790]: E1003 10:22:31.791002 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e\": container with ID starting with 50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e not found: ID does not exist" containerID="50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e" Oct 03 10:22:31 crc kubenswrapper[4790]: I1003 10:22:31.791028 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e"} err="failed to get container status \"50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e\": rpc error: code = NotFound desc = could not find container \"50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e\": container with ID starting with 50646691f7017af666747d57ff8ebb3d3d0b8387e0afdffeef5e0c9e5b23d68e not found: ID does not exist" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.132545 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zb589"] Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.132841 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zb589" podUID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerName="registry-server" containerID="cri-o://58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027" gracePeriod=2 Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.340282 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="445a5bca-64c2-4798-94eb-5e36f616043d" 
path="/var/lib/kubelet/pods/445a5bca-64c2-4798-94eb-5e36f616043d/volumes" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.643696 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.699515 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff5k7\" (UniqueName: \"kubernetes.io/projected/a92e8ea7-1551-436b-89d7-0409296a4c41-kube-api-access-ff5k7\") pod \"a92e8ea7-1551-436b-89d7-0409296a4c41\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.699554 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-catalog-content\") pod \"a92e8ea7-1551-436b-89d7-0409296a4c41\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.699697 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-utilities\") pod \"a92e8ea7-1551-436b-89d7-0409296a4c41\" (UID: \"a92e8ea7-1551-436b-89d7-0409296a4c41\") " Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.700598 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-utilities" (OuterVolumeSpecName: "utilities") pod "a92e8ea7-1551-436b-89d7-0409296a4c41" (UID: "a92e8ea7-1551-436b-89d7-0409296a4c41"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.702069 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.706924 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a92e8ea7-1551-436b-89d7-0409296a4c41-kube-api-access-ff5k7" (OuterVolumeSpecName: "kube-api-access-ff5k7") pod "a92e8ea7-1551-436b-89d7-0409296a4c41" (UID: "a92e8ea7-1551-436b-89d7-0409296a4c41"). InnerVolumeSpecName "kube-api-access-ff5k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.726605 4790 generic.go:334] "Generic (PLEG): container finished" podID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerID="58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027" exitCode=0 Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.726676 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb589" event={"ID":"a92e8ea7-1551-436b-89d7-0409296a4c41","Type":"ContainerDied","Data":"58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027"} Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.726702 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zb589" event={"ID":"a92e8ea7-1551-436b-89d7-0409296a4c41","Type":"ContainerDied","Data":"f3d9b3c1c176872ecb3b9f2cf1d2eef73b66f895ca7b2b35eabf0b135542ccc0"} Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.726718 4790 scope.go:117] "RemoveContainer" containerID="58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.726822 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zb589" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.754766 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a92e8ea7-1551-436b-89d7-0409296a4c41" (UID: "a92e8ea7-1551-436b-89d7-0409296a4c41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.770980 4790 scope.go:117] "RemoveContainer" containerID="daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.795447 4790 scope.go:117] "RemoveContainer" containerID="0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.804436 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff5k7\" (UniqueName: \"kubernetes.io/projected/a92e8ea7-1551-436b-89d7-0409296a4c41-kube-api-access-ff5k7\") on node \"crc\" DevicePath \"\"" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.805059 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a92e8ea7-1551-436b-89d7-0409296a4c41-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.822747 4790 scope.go:117] "RemoveContainer" containerID="58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027" Oct 03 10:22:32 crc kubenswrapper[4790]: E1003 10:22:32.823130 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027\": container with ID starting with 58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027 not found: ID does not exist" 
containerID="58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.823172 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027"} err="failed to get container status \"58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027\": rpc error: code = NotFound desc = could not find container \"58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027\": container with ID starting with 58d00cbe685ddaf30bb64bae4754838e20fed78886e641ea0775326097471027 not found: ID does not exist" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.823198 4790 scope.go:117] "RemoveContainer" containerID="daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89" Oct 03 10:22:32 crc kubenswrapper[4790]: E1003 10:22:32.823857 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89\": container with ID starting with daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89 not found: ID does not exist" containerID="daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.823905 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89"} err="failed to get container status \"daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89\": rpc error: code = NotFound desc = could not find container \"daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89\": container with ID starting with daa70ae5a3915d50d03818be9b9b56c31626eca67a74763513e0ab8240e27d89 not found: ID does not exist" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.823938 4790 scope.go:117] 
"RemoveContainer" containerID="0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de" Oct 03 10:22:32 crc kubenswrapper[4790]: E1003 10:22:32.824285 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de\": container with ID starting with 0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de not found: ID does not exist" containerID="0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de" Oct 03 10:22:32 crc kubenswrapper[4790]: I1003 10:22:32.824310 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de"} err="failed to get container status \"0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de\": rpc error: code = NotFound desc = could not find container \"0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de\": container with ID starting with 0354dd7c9742b754be4b9b5516060b6c5ee7488aee82ff3b980d961460c0f3de not found: ID does not exist" Oct 03 10:22:33 crc kubenswrapper[4790]: I1003 10:22:33.060171 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zb589"] Oct 03 10:22:33 crc kubenswrapper[4790]: I1003 10:22:33.071174 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zb589"] Oct 03 10:22:34 crc kubenswrapper[4790]: I1003 10:22:34.340232 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a92e8ea7-1551-436b-89d7-0409296a4c41" path="/var/lib/kubelet/pods/a92e8ea7-1551-436b-89d7-0409296a4c41/volumes" Oct 03 10:22:40 crc kubenswrapper[4790]: I1003 10:22:40.186486 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:22:40 crc kubenswrapper[4790]: I1003 10:22:40.187063 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:22:43 crc kubenswrapper[4790]: I1003 10:22:43.008944 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-wlf7r_66046d21-5295-4dd4-9f38-3859fc71918e/prometheus-operator/0.log" Oct 03 10:22:43 crc kubenswrapper[4790]: I1003 10:22:43.199376 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f956bfc86-24zvx_9a774d8d-e6d0-4a3c-ad82-767e7a213d2e/prometheus-operator-admission-webhook/0.log" Oct 03 10:22:43 crc kubenswrapper[4790]: I1003 10:22:43.241598 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f956bfc86-q2djd_2f0ce448-4fba-4522-b3d3-8287c471d89b/prometheus-operator-admission-webhook/0.log" Oct 03 10:22:43 crc kubenswrapper[4790]: I1003 10:22:43.425922 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-rmlf2_feadd2c8-de45-4819-8626-812b7007c679/operator/0.log" Oct 03 10:22:43 crc kubenswrapper[4790]: I1003 10:22:43.491826 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-8sh7j_ed2d3ffa-e89d-45d0-b131-8567b5abda08/perses-operator/0.log" Oct 03 10:23:10 crc kubenswrapper[4790]: I1003 10:23:10.186039 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:23:10 crc kubenswrapper[4790]: I1003 10:23:10.186795 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:23:10 crc kubenswrapper[4790]: I1003 10:23:10.186873 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 10:23:10 crc kubenswrapper[4790]: I1003 10:23:10.187893 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72e7e65c5c41feebe96198a1537bafbf79d8997c016f7bcda12de7ad1bebd3e6"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:23:10 crc kubenswrapper[4790]: I1003 10:23:10.187957 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://72e7e65c5c41feebe96198a1537bafbf79d8997c016f7bcda12de7ad1bebd3e6" gracePeriod=600 Oct 03 10:23:11 crc kubenswrapper[4790]: I1003 10:23:11.097117 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="72e7e65c5c41feebe96198a1537bafbf79d8997c016f7bcda12de7ad1bebd3e6" exitCode=0 Oct 03 10:23:11 crc kubenswrapper[4790]: I1003 10:23:11.097218 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"72e7e65c5c41feebe96198a1537bafbf79d8997c016f7bcda12de7ad1bebd3e6"} Oct 03 10:23:11 crc kubenswrapper[4790]: I1003 10:23:11.097759 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34"} Oct 03 10:23:11 crc kubenswrapper[4790]: I1003 10:23:11.097787 4790 scope.go:117] "RemoveContainer" containerID="54762a3a6941b00ff04aded3b5d01b002e1f5c184e678c224d6c8f4fab6a3037" Oct 03 10:25:06 crc kubenswrapper[4790]: I1003 10:25:06.239700 4790 generic.go:334] "Generic (PLEG): container finished" podID="c22cc451-27ee-4530-8185-d5285f391019" containerID="36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b" exitCode=0 Oct 03 10:25:06 crc kubenswrapper[4790]: I1003 10:25:06.239788 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hlhcg/must-gather-bgnkz" event={"ID":"c22cc451-27ee-4530-8185-d5285f391019","Type":"ContainerDied","Data":"36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b"} Oct 03 10:25:06 crc kubenswrapper[4790]: I1003 10:25:06.241056 4790 scope.go:117] "RemoveContainer" containerID="36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b" Oct 03 10:25:06 crc kubenswrapper[4790]: I1003 10:25:06.953487 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hlhcg_must-gather-bgnkz_c22cc451-27ee-4530-8185-d5285f391019/gather/0.log" Oct 03 10:25:10 crc kubenswrapper[4790]: I1003 10:25:10.186462 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:25:10 crc kubenswrapper[4790]: I1003 10:25:10.186929 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:25:15 crc kubenswrapper[4790]: I1003 10:25:15.165071 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hlhcg/must-gather-bgnkz"] Oct 03 10:25:15 crc kubenswrapper[4790]: I1003 10:25:15.166883 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hlhcg/must-gather-bgnkz" podUID="c22cc451-27ee-4530-8185-d5285f391019" containerName="copy" containerID="cri-o://b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b" gracePeriod=2 Oct 03 10:25:15 crc kubenswrapper[4790]: I1003 10:25:15.176964 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hlhcg/must-gather-bgnkz"] Oct 03 10:25:15 crc kubenswrapper[4790]: I1003 10:25:15.704935 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hlhcg_must-gather-bgnkz_c22cc451-27ee-4530-8185-d5285f391019/copy/0.log" Oct 03 10:25:15 crc kubenswrapper[4790]: I1003 10:25:15.705601 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/must-gather-bgnkz" Oct 03 10:25:15 crc kubenswrapper[4790]: I1003 10:25:15.817464 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkk4x\" (UniqueName: \"kubernetes.io/projected/c22cc451-27ee-4530-8185-d5285f391019-kube-api-access-dkk4x\") pod \"c22cc451-27ee-4530-8185-d5285f391019\" (UID: \"c22cc451-27ee-4530-8185-d5285f391019\") " Oct 03 10:25:15 crc kubenswrapper[4790]: I1003 10:25:15.817768 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c22cc451-27ee-4530-8185-d5285f391019-must-gather-output\") pod \"c22cc451-27ee-4530-8185-d5285f391019\" (UID: \"c22cc451-27ee-4530-8185-d5285f391019\") " Oct 03 10:25:15 crc kubenswrapper[4790]: I1003 10:25:15.829045 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c22cc451-27ee-4530-8185-d5285f391019-kube-api-access-dkk4x" (OuterVolumeSpecName: "kube-api-access-dkk4x") pod "c22cc451-27ee-4530-8185-d5285f391019" (UID: "c22cc451-27ee-4530-8185-d5285f391019"). InnerVolumeSpecName "kube-api-access-dkk4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:25:15 crc kubenswrapper[4790]: I1003 10:25:15.920127 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkk4x\" (UniqueName: \"kubernetes.io/projected/c22cc451-27ee-4530-8185-d5285f391019-kube-api-access-dkk4x\") on node \"crc\" DevicePath \"\"" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.019226 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c22cc451-27ee-4530-8185-d5285f391019-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c22cc451-27ee-4530-8185-d5285f391019" (UID: "c22cc451-27ee-4530-8185-d5285f391019"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.022711 4790 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c22cc451-27ee-4530-8185-d5285f391019-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.340277 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c22cc451-27ee-4530-8185-d5285f391019" path="/var/lib/kubelet/pods/c22cc451-27ee-4530-8185-d5285f391019/volumes" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.348667 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hlhcg_must-gather-bgnkz_c22cc451-27ee-4530-8185-d5285f391019/copy/0.log" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.349116 4790 generic.go:334] "Generic (PLEG): container finished" podID="c22cc451-27ee-4530-8185-d5285f391019" containerID="b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b" exitCode=143 Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.349166 4790 scope.go:117] "RemoveContainer" containerID="b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.349291 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hlhcg/must-gather-bgnkz" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.401200 4790 scope.go:117] "RemoveContainer" containerID="36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.450088 4790 scope.go:117] "RemoveContainer" containerID="b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b" Oct 03 10:25:16 crc kubenswrapper[4790]: E1003 10:25:16.450568 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b\": container with ID starting with b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b not found: ID does not exist" containerID="b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.450698 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b"} err="failed to get container status \"b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b\": rpc error: code = NotFound desc = could not find container \"b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b\": container with ID starting with b0adbf2effa4fe25c92816baac362267e1ad66ddf919d296dfe31b7f5742fc1b not found: ID does not exist" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.450736 4790 scope.go:117] "RemoveContainer" containerID="36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b" Oct 03 10:25:16 crc kubenswrapper[4790]: E1003 10:25:16.451128 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b\": container with ID starting with 
36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b not found: ID does not exist" containerID="36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b" Oct 03 10:25:16 crc kubenswrapper[4790]: I1003 10:25:16.451248 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b"} err="failed to get container status \"36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b\": rpc error: code = NotFound desc = could not find container \"36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b\": container with ID starting with 36ca8af4f7ba1773b65e1bd2c7e1967fb03dd64f04b383791b879ba331e9911b not found: ID does not exist" Oct 03 10:25:31 crc kubenswrapper[4790]: I1003 10:25:31.367705 4790 scope.go:117] "RemoveContainer" containerID="3d5e61aff8681fa2b49f167ac095a0e9d8a36057541842e94d3f2a2911d20e84" Oct 03 10:25:40 crc kubenswrapper[4790]: I1003 10:25:40.186352 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:25:40 crc kubenswrapper[4790]: I1003 10:25:40.186916 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.256610 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dxvsr/must-gather-p4m4p"] Oct 03 10:25:43 crc kubenswrapper[4790]: E1003 10:25:43.258657 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerName="extract-content" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.258775 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerName="extract-content" Oct 03 10:25:43 crc kubenswrapper[4790]: E1003 10:25:43.258868 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22cc451-27ee-4530-8185-d5285f391019" containerName="gather" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.258950 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22cc451-27ee-4530-8185-d5285f391019" containerName="gather" Oct 03 10:25:43 crc kubenswrapper[4790]: E1003 10:25:43.259041 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c22cc451-27ee-4530-8185-d5285f391019" containerName="copy" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.259123 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c22cc451-27ee-4530-8185-d5285f391019" containerName="copy" Oct 03 10:25:43 crc kubenswrapper[4790]: E1003 10:25:43.259231 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445a5bca-64c2-4798-94eb-5e36f616043d" containerName="extract-utilities" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.259314 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="445a5bca-64c2-4798-94eb-5e36f616043d" containerName="extract-utilities" Oct 03 10:25:43 crc kubenswrapper[4790]: E1003 10:25:43.259406 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerName="registry-server" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.259508 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerName="registry-server" Oct 03 10:25:43 crc kubenswrapper[4790]: E1003 10:25:43.259614 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445a5bca-64c2-4798-94eb-5e36f616043d" 
containerName="extract-content" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.259693 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="445a5bca-64c2-4798-94eb-5e36f616043d" containerName="extract-content" Oct 03 10:25:43 crc kubenswrapper[4790]: E1003 10:25:43.259785 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445a5bca-64c2-4798-94eb-5e36f616043d" containerName="registry-server" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.259862 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="445a5bca-64c2-4798-94eb-5e36f616043d" containerName="registry-server" Oct 03 10:25:43 crc kubenswrapper[4790]: E1003 10:25:43.259939 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerName="extract-utilities" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.260010 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerName="extract-utilities" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.260325 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c22cc451-27ee-4530-8185-d5285f391019" containerName="gather" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.261178 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="445a5bca-64c2-4798-94eb-5e36f616043d" containerName="registry-server" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.261302 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92e8ea7-1551-436b-89d7-0409296a4c41" containerName="registry-server" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.261431 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c22cc451-27ee-4530-8185-d5285f391019" containerName="copy" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.262972 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxvsr/must-gather-p4m4p" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.267994 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dxvsr"/"default-dockercfg-7695q" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.268173 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dxvsr"/"kube-root-ca.crt" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.268591 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dxvsr"/"openshift-service-ca.crt" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.270346 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dxvsr/must-gather-p4m4p"] Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.275326 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f3a7f20-a077-41e3-9971-85318acc2a26-must-gather-output\") pod \"must-gather-p4m4p\" (UID: \"6f3a7f20-a077-41e3-9971-85318acc2a26\") " pod="openshift-must-gather-dxvsr/must-gather-p4m4p" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.275409 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p286j\" (UniqueName: \"kubernetes.io/projected/6f3a7f20-a077-41e3-9971-85318acc2a26-kube-api-access-p286j\") pod \"must-gather-p4m4p\" (UID: \"6f3a7f20-a077-41e3-9971-85318acc2a26\") " pod="openshift-must-gather-dxvsr/must-gather-p4m4p" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.378622 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f3a7f20-a077-41e3-9971-85318acc2a26-must-gather-output\") pod \"must-gather-p4m4p\" (UID: \"6f3a7f20-a077-41e3-9971-85318acc2a26\") " 
pod="openshift-must-gather-dxvsr/must-gather-p4m4p" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.378723 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p286j\" (UniqueName: \"kubernetes.io/projected/6f3a7f20-a077-41e3-9971-85318acc2a26-kube-api-access-p286j\") pod \"must-gather-p4m4p\" (UID: \"6f3a7f20-a077-41e3-9971-85318acc2a26\") " pod="openshift-must-gather-dxvsr/must-gather-p4m4p" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.379028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f3a7f20-a077-41e3-9971-85318acc2a26-must-gather-output\") pod \"must-gather-p4m4p\" (UID: \"6f3a7f20-a077-41e3-9971-85318acc2a26\") " pod="openshift-must-gather-dxvsr/must-gather-p4m4p" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.399225 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p286j\" (UniqueName: \"kubernetes.io/projected/6f3a7f20-a077-41e3-9971-85318acc2a26-kube-api-access-p286j\") pod \"must-gather-p4m4p\" (UID: \"6f3a7f20-a077-41e3-9971-85318acc2a26\") " pod="openshift-must-gather-dxvsr/must-gather-p4m4p" Oct 03 10:25:43 crc kubenswrapper[4790]: I1003 10:25:43.582331 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxvsr/must-gather-p4m4p" Oct 03 10:25:44 crc kubenswrapper[4790]: I1003 10:25:44.105951 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dxvsr/must-gather-p4m4p"] Oct 03 10:25:44 crc kubenswrapper[4790]: I1003 10:25:44.652869 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/must-gather-p4m4p" event={"ID":"6f3a7f20-a077-41e3-9971-85318acc2a26","Type":"ContainerStarted","Data":"7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79"} Oct 03 10:25:44 crc kubenswrapper[4790]: I1003 10:25:44.653493 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/must-gather-p4m4p" event={"ID":"6f3a7f20-a077-41e3-9971-85318acc2a26","Type":"ContainerStarted","Data":"c4b445da29f75f624a6bf02309245f5e89efd2f31526f76cc54f5e50bad34a02"} Oct 03 10:25:45 crc kubenswrapper[4790]: I1003 10:25:45.663162 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/must-gather-p4m4p" event={"ID":"6f3a7f20-a077-41e3-9971-85318acc2a26","Type":"ContainerStarted","Data":"4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02"} Oct 03 10:25:45 crc kubenswrapper[4790]: I1003 10:25:45.685764 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dxvsr/must-gather-p4m4p" podStartSLOduration=2.685740006 podStartE2EDuration="2.685740006s" podCreationTimestamp="2025-10-03 10:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:25:45.67719982 +0000 UTC m=+6332.030134058" watchObservedRunningTime="2025-10-03 10:25:45.685740006 +0000 UTC m=+6332.038674244" Oct 03 10:25:48 crc kubenswrapper[4790]: I1003 10:25:48.781003 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dxvsr/crc-debug-ktqmr"] Oct 03 10:25:48 crc kubenswrapper[4790]: 
I1003 10:25:48.782849 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" Oct 03 10:25:48 crc kubenswrapper[4790]: I1003 10:25:48.810916 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5wtn\" (UniqueName: \"kubernetes.io/projected/a1d20993-d138-4931-9b85-9d201850df77-kube-api-access-n5wtn\") pod \"crc-debug-ktqmr\" (UID: \"a1d20993-d138-4931-9b85-9d201850df77\") " pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" Oct 03 10:25:48 crc kubenswrapper[4790]: I1003 10:25:48.810979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1d20993-d138-4931-9b85-9d201850df77-host\") pod \"crc-debug-ktqmr\" (UID: \"a1d20993-d138-4931-9b85-9d201850df77\") " pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" Oct 03 10:25:48 crc kubenswrapper[4790]: I1003 10:25:48.912768 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5wtn\" (UniqueName: \"kubernetes.io/projected/a1d20993-d138-4931-9b85-9d201850df77-kube-api-access-n5wtn\") pod \"crc-debug-ktqmr\" (UID: \"a1d20993-d138-4931-9b85-9d201850df77\") " pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" Oct 03 10:25:48 crc kubenswrapper[4790]: I1003 10:25:48.912832 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1d20993-d138-4931-9b85-9d201850df77-host\") pod \"crc-debug-ktqmr\" (UID: \"a1d20993-d138-4931-9b85-9d201850df77\") " pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" Oct 03 10:25:48 crc kubenswrapper[4790]: I1003 10:25:48.912944 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1d20993-d138-4931-9b85-9d201850df77-host\") pod \"crc-debug-ktqmr\" (UID: \"a1d20993-d138-4931-9b85-9d201850df77\") 
" pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" Oct 03 10:25:48 crc kubenswrapper[4790]: I1003 10:25:48.933654 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5wtn\" (UniqueName: \"kubernetes.io/projected/a1d20993-d138-4931-9b85-9d201850df77-kube-api-access-n5wtn\") pod \"crc-debug-ktqmr\" (UID: \"a1d20993-d138-4931-9b85-9d201850df77\") " pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" Oct 03 10:25:49 crc kubenswrapper[4790]: I1003 10:25:49.105097 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" Oct 03 10:25:49 crc kubenswrapper[4790]: W1003 10:25:49.152139 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1d20993_d138_4931_9b85_9d201850df77.slice/crio-0c6ba15da9d5a9001c3425a74471b29fed84237eada5a8cd2ec7f044b511d979 WatchSource:0}: Error finding container 0c6ba15da9d5a9001c3425a74471b29fed84237eada5a8cd2ec7f044b511d979: Status 404 returned error can't find the container with id 0c6ba15da9d5a9001c3425a74471b29fed84237eada5a8cd2ec7f044b511d979 Oct 03 10:25:49 crc kubenswrapper[4790]: I1003 10:25:49.699059 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" event={"ID":"a1d20993-d138-4931-9b85-9d201850df77","Type":"ContainerStarted","Data":"9af713dbabb8b5349bf51d66cb1df5d7bf9315c1bedd468fc4acaaefab00611a"} Oct 03 10:25:49 crc kubenswrapper[4790]: I1003 10:25:49.701161 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" event={"ID":"a1d20993-d138-4931-9b85-9d201850df77","Type":"ContainerStarted","Data":"0c6ba15da9d5a9001c3425a74471b29fed84237eada5a8cd2ec7f044b511d979"} Oct 03 10:25:49 crc kubenswrapper[4790]: I1003 10:25:49.719449 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" 
podStartSLOduration=1.719416679 podStartE2EDuration="1.719416679s" podCreationTimestamp="2025-10-03 10:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 10:25:49.715093539 +0000 UTC m=+6336.068027767" watchObservedRunningTime="2025-10-03 10:25:49.719416679 +0000 UTC m=+6336.072350927" Oct 03 10:26:10 crc kubenswrapper[4790]: I1003 10:26:10.185901 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 03 10:26:10 crc kubenswrapper[4790]: I1003 10:26:10.186365 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 03 10:26:10 crc kubenswrapper[4790]: I1003 10:26:10.186588 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" Oct 03 10:26:10 crc kubenswrapper[4790]: I1003 10:26:10.187424 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 03 10:26:10 crc kubenswrapper[4790]: I1003 10:26:10.187487 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" 
podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" gracePeriod=600 Oct 03 10:26:10 crc kubenswrapper[4790]: E1003 10:26:10.323396 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:26:10 crc kubenswrapper[4790]: I1003 10:26:10.965989 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" exitCode=0 Oct 03 10:26:10 crc kubenswrapper[4790]: I1003 10:26:10.966056 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34"} Oct 03 10:26:10 crc kubenswrapper[4790]: I1003 10:26:10.966310 4790 scope.go:117] "RemoveContainer" containerID="72e7e65c5c41feebe96198a1537bafbf79d8997c016f7bcda12de7ad1bebd3e6" Oct 03 10:26:10 crc kubenswrapper[4790]: I1003 10:26:10.968325 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:26:10 crc kubenswrapper[4790]: E1003 10:26:10.969067 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:26:22 crc kubenswrapper[4790]: I1003 10:26:22.328319 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:26:22 crc kubenswrapper[4790]: E1003 10:26:22.329238 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:26:36 crc kubenswrapper[4790]: I1003 10:26:36.329817 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:26:36 crc kubenswrapper[4790]: E1003 10:26:36.330808 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.553172 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4sd9f"] Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.556652 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.564205 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4sd9f"] Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.646902 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-catalog-content\") pod \"redhat-marketplace-4sd9f\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.647281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-utilities\") pod \"redhat-marketplace-4sd9f\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.647721 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm9tw\" (UniqueName: \"kubernetes.io/projected/2f0029d4-804b-4699-89ba-a01fd8d91fd7-kube-api-access-tm9tw\") pod \"redhat-marketplace-4sd9f\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.750154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm9tw\" (UniqueName: \"kubernetes.io/projected/2f0029d4-804b-4699-89ba-a01fd8d91fd7-kube-api-access-tm9tw\") pod \"redhat-marketplace-4sd9f\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.750536 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-catalog-content\") pod \"redhat-marketplace-4sd9f\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.750643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-utilities\") pod \"redhat-marketplace-4sd9f\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.751183 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-utilities\") pod \"redhat-marketplace-4sd9f\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.751792 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-catalog-content\") pod \"redhat-marketplace-4sd9f\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.780653 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm9tw\" (UniqueName: \"kubernetes.io/projected/2f0029d4-804b-4699-89ba-a01fd8d91fd7-kube-api-access-tm9tw\") pod \"redhat-marketplace-4sd9f\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:41 crc kubenswrapper[4790]: I1003 10:26:41.884312 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:42 crc kubenswrapper[4790]: I1003 10:26:42.427878 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4sd9f"] Oct 03 10:26:43 crc kubenswrapper[4790]: I1003 10:26:43.301326 4790 generic.go:334] "Generic (PLEG): container finished" podID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerID="08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c" exitCode=0 Oct 03 10:26:43 crc kubenswrapper[4790]: I1003 10:26:43.301378 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sd9f" event={"ID":"2f0029d4-804b-4699-89ba-a01fd8d91fd7","Type":"ContainerDied","Data":"08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c"} Oct 03 10:26:43 crc kubenswrapper[4790]: I1003 10:26:43.301679 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sd9f" event={"ID":"2f0029d4-804b-4699-89ba-a01fd8d91fd7","Type":"ContainerStarted","Data":"375c86cf5bce6eba8a77dba373db4dc25c95e7100aa73263aa0c6e5339d704c1"} Oct 03 10:26:44 crc kubenswrapper[4790]: I1003 10:26:44.312288 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sd9f" event={"ID":"2f0029d4-804b-4699-89ba-a01fd8d91fd7","Type":"ContainerStarted","Data":"d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce"} Oct 03 10:26:45 crc kubenswrapper[4790]: I1003 10:26:45.323528 4790 generic.go:334] "Generic (PLEG): container finished" podID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerID="d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce" exitCode=0 Oct 03 10:26:45 crc kubenswrapper[4790]: I1003 10:26:45.323737 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sd9f" 
event={"ID":"2f0029d4-804b-4699-89ba-a01fd8d91fd7","Type":"ContainerDied","Data":"d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce"} Oct 03 10:26:46 crc kubenswrapper[4790]: I1003 10:26:46.338997 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sd9f" event={"ID":"2f0029d4-804b-4699-89ba-a01fd8d91fd7","Type":"ContainerStarted","Data":"8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3"} Oct 03 10:26:46 crc kubenswrapper[4790]: I1003 10:26:46.357896 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4sd9f" podStartSLOduration=2.827655663 podStartE2EDuration="5.357875422s" podCreationTimestamp="2025-10-03 10:26:41 +0000 UTC" firstStartedPulling="2025-10-03 10:26:43.304028668 +0000 UTC m=+6389.656962916" lastFinishedPulling="2025-10-03 10:26:45.834248437 +0000 UTC m=+6392.187182675" observedRunningTime="2025-10-03 10:26:46.351102035 +0000 UTC m=+6392.704036283" watchObservedRunningTime="2025-10-03 10:26:46.357875422 +0000 UTC m=+6392.710809660" Oct 03 10:26:48 crc kubenswrapper[4790]: I1003 10:26:48.328828 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:26:48 crc kubenswrapper[4790]: E1003 10:26:48.329369 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:26:51 crc kubenswrapper[4790]: I1003 10:26:51.884764 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:51 crc 
kubenswrapper[4790]: I1003 10:26:51.885292 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:51 crc kubenswrapper[4790]: I1003 10:26:51.937965 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:52 crc kubenswrapper[4790]: I1003 10:26:52.440064 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:52 crc kubenswrapper[4790]: I1003 10:26:52.483553 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4sd9f"] Oct 03 10:26:54 crc kubenswrapper[4790]: I1003 10:26:54.410036 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4sd9f" podUID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerName="registry-server" containerID="cri-o://8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3" gracePeriod=2 Oct 03 10:26:54 crc kubenswrapper[4790]: I1003 10:26:54.922714 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.031294 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm9tw\" (UniqueName: \"kubernetes.io/projected/2f0029d4-804b-4699-89ba-a01fd8d91fd7-kube-api-access-tm9tw\") pod \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.031450 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-catalog-content\") pod \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.031475 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-utilities\") pod \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\" (UID: \"2f0029d4-804b-4699-89ba-a01fd8d91fd7\") " Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.032430 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-utilities" (OuterVolumeSpecName: "utilities") pod "2f0029d4-804b-4699-89ba-a01fd8d91fd7" (UID: "2f0029d4-804b-4699-89ba-a01fd8d91fd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.037908 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f0029d4-804b-4699-89ba-a01fd8d91fd7-kube-api-access-tm9tw" (OuterVolumeSpecName: "kube-api-access-tm9tw") pod "2f0029d4-804b-4699-89ba-a01fd8d91fd7" (UID: "2f0029d4-804b-4699-89ba-a01fd8d91fd7"). InnerVolumeSpecName "kube-api-access-tm9tw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.047342 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f0029d4-804b-4699-89ba-a01fd8d91fd7" (UID: "2f0029d4-804b-4699-89ba-a01fd8d91fd7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.133867 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.133904 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f0029d4-804b-4699-89ba-a01fd8d91fd7-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.133920 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm9tw\" (UniqueName: \"kubernetes.io/projected/2f0029d4-804b-4699-89ba-a01fd8d91fd7-kube-api-access-tm9tw\") on node \"crc\" DevicePath \"\"" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.421059 4790 generic.go:334] "Generic (PLEG): container finished" podID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerID="8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3" exitCode=0 Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.421110 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4sd9f" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.421107 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sd9f" event={"ID":"2f0029d4-804b-4699-89ba-a01fd8d91fd7","Type":"ContainerDied","Data":"8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3"} Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.421275 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4sd9f" event={"ID":"2f0029d4-804b-4699-89ba-a01fd8d91fd7","Type":"ContainerDied","Data":"375c86cf5bce6eba8a77dba373db4dc25c95e7100aa73263aa0c6e5339d704c1"} Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.421313 4790 scope.go:117] "RemoveContainer" containerID="8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.464714 4790 scope.go:117] "RemoveContainer" containerID="d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.465134 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4sd9f"] Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.480866 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4sd9f"] Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.485788 4790 scope.go:117] "RemoveContainer" containerID="08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.537064 4790 scope.go:117] "RemoveContainer" containerID="8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3" Oct 03 10:26:55 crc kubenswrapper[4790]: E1003 10:26:55.537563 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3\": container with ID starting with 8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3 not found: ID does not exist" containerID="8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.537707 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3"} err="failed to get container status \"8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3\": rpc error: code = NotFound desc = could not find container \"8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3\": container with ID starting with 8ecc1c31264ed7b4e24050663432b0b3c8f62a07d5bfc48c932ec584ea0069f3 not found: ID does not exist" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.537733 4790 scope.go:117] "RemoveContainer" containerID="d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce" Oct 03 10:26:55 crc kubenswrapper[4790]: E1003 10:26:55.537967 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce\": container with ID starting with d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce not found: ID does not exist" containerID="d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.537994 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce"} err="failed to get container status \"d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce\": rpc error: code = NotFound desc = could not find container \"d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce\": container with ID 
starting with d263f2c87c20863585359eb4ca0eeb77323400619b666b0f1904657d9ae24fce not found: ID does not exist" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.538015 4790 scope.go:117] "RemoveContainer" containerID="08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c" Oct 03 10:26:55 crc kubenswrapper[4790]: E1003 10:26:55.538246 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c\": container with ID starting with 08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c not found: ID does not exist" containerID="08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c" Oct 03 10:26:55 crc kubenswrapper[4790]: I1003 10:26:55.538270 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c"} err="failed to get container status \"08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c\": rpc error: code = NotFound desc = could not find container \"08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c\": container with ID starting with 08db54df2352820891cb5d71dd280211bcd98483e8c7a5698c85daf38e9f419c not found: ID does not exist" Oct 03 10:26:56 crc kubenswrapper[4790]: I1003 10:26:56.351077 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" path="/var/lib/kubelet/pods/2f0029d4-804b-4699-89ba-a01fd8d91fd7/volumes" Oct 03 10:27:00 crc kubenswrapper[4790]: I1003 10:27:00.329393 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:27:00 crc kubenswrapper[4790]: E1003 10:27:00.330126 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:27:09 crc kubenswrapper[4790]: I1003 10:27:09.456191 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55549c44b-l2c25_f1f79552-9846-40d8-b095-ac773b5af079/barbican-api/0.log" Oct 03 10:27:09 crc kubenswrapper[4790]: I1003 10:27:09.483216 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-55549c44b-l2c25_f1f79552-9846-40d8-b095-ac773b5af079/barbican-api-log/0.log" Oct 03 10:27:09 crc kubenswrapper[4790]: I1003 10:27:09.718205 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-56674f8d84-pn6qq_0db45010-78ab-486c-b5be-bc2431f3ab9d/barbican-keystone-listener/0.log" Oct 03 10:27:09 crc kubenswrapper[4790]: I1003 10:27:09.770549 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-56674f8d84-pn6qq_0db45010-78ab-486c-b5be-bc2431f3ab9d/barbican-keystone-listener-log/0.log" Oct 03 10:27:09 crc kubenswrapper[4790]: I1003 10:27:09.933936 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-765d8bdc5-vfk8w_7a3c6af9-6ba2-4e18-8779-1e71dbc175a1/barbican-worker/0.log" Oct 03 10:27:09 crc kubenswrapper[4790]: I1003 10:27:09.966058 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-765d8bdc5-vfk8w_7a3c6af9-6ba2-4e18-8779-1e71dbc175a1/barbican-worker-log/0.log" Oct 03 10:27:10 crc kubenswrapper[4790]: I1003 10:27:10.145308 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-l8fbg_97eb47a0-1a88-49ee-9d24-b9f0624ed543/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:10 crc 
kubenswrapper[4790]: I1003 10:27:10.443337 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab369c6c-af88-4f07-bc5c-fb229a9b2cc8/ceilometer-notification-agent/0.log" Oct 03 10:27:10 crc kubenswrapper[4790]: I1003 10:27:10.485249 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab369c6c-af88-4f07-bc5c-fb229a9b2cc8/proxy-httpd/0.log" Oct 03 10:27:10 crc kubenswrapper[4790]: I1003 10:27:10.555944 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab369c6c-af88-4f07-bc5c-fb229a9b2cc8/ceilometer-central-agent/0.log" Oct 03 10:27:10 crc kubenswrapper[4790]: I1003 10:27:10.617983 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ab369c6c-af88-4f07-bc5c-fb229a9b2cc8/sg-core/0.log" Oct 03 10:27:10 crc kubenswrapper[4790]: I1003 10:27:10.877920 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a7f77060-c6da-48ba-9c55-e454fd1da9fd/cinder-api-log/0.log" Oct 03 10:27:11 crc kubenswrapper[4790]: I1003 10:27:11.347219 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_8927cfc4-2b2e-4a79-bd9e-3ff95160b86a/probe/0.log" Oct 03 10:27:11 crc kubenswrapper[4790]: I1003 10:27:11.877084 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_8927cfc4-2b2e-4a79-bd9e-3ff95160b86a/cinder-backup/0.log" Oct 03 10:27:11 crc kubenswrapper[4790]: I1003 10:27:11.946405 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_1289ded7-2702-43dc-9d28-00567ba7b668/cinder-scheduler/0.log" Oct 03 10:27:12 crc kubenswrapper[4790]: I1003 10:27:12.092169 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a7f77060-c6da-48ba-9c55-e454fd1da9fd/cinder-api/0.log" Oct 03 10:27:12 crc kubenswrapper[4790]: I1003 10:27:12.193212 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_1289ded7-2702-43dc-9d28-00567ba7b668/probe/0.log" Oct 03 10:27:12 crc kubenswrapper[4790]: I1003 10:27:12.446350 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_63400791-2785-485b-86b4-ffd9943163a5/probe/0.log" Oct 03 10:27:12 crc kubenswrapper[4790]: I1003 10:27:12.708854 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_63400791-2785-485b-86b4-ffd9943163a5/cinder-volume/0.log" Oct 03 10:27:12 crc kubenswrapper[4790]: I1003 10:27:12.908733 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_313b0b89-2532-482b-994d-317dfc54b6a7/probe/0.log" Oct 03 10:27:13 crc kubenswrapper[4790]: I1003 10:27:13.020155 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-jr7ff_94a4a6e2-8a66-4144-b0ea-893b7519c242/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:13 crc kubenswrapper[4790]: I1003 10:27:13.128357 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_313b0b89-2532-482b-994d-317dfc54b6a7/cinder-volume/0.log" Oct 03 10:27:13 crc kubenswrapper[4790]: I1003 10:27:13.223059 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-6wbf8_e3ec6c6b-f744-4d2d-a924-32f4682b1dfa/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:13 crc kubenswrapper[4790]: I1003 10:27:13.299407 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-m8tf7_46adefe5-7949-4e5f-a07c-81569f5f6ad8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:13 crc kubenswrapper[4790]: I1003 10:27:13.428747 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-79ff9d4d8f-zfwpx_33a1dfa6-9323-4d22-b0c1-0bd210b9638f/init/0.log" Oct 03 10:27:13 crc kubenswrapper[4790]: I1003 10:27:13.598780 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79ff9d4d8f-zfwpx_33a1dfa6-9323-4d22-b0c1-0bd210b9638f/init/0.log" Oct 03 10:27:13 crc kubenswrapper[4790]: I1003 10:27:13.654111 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mdlnz_2542827b-a700-41bf-80ff-22928d83630a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:13 crc kubenswrapper[4790]: I1003 10:27:13.754404 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79ff9d4d8f-zfwpx_33a1dfa6-9323-4d22-b0c1-0bd210b9638f/dnsmasq-dns/0.log" Oct 03 10:27:13 crc kubenswrapper[4790]: I1003 10:27:13.861794 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_42c4562a-910e-47f6-befc-effb8f851374/glance-httpd/0.log" Oct 03 10:27:13 crc kubenswrapper[4790]: I1003 10:27:13.882176 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_42c4562a-910e-47f6-befc-effb8f851374/glance-log/0.log" Oct 03 10:27:14 crc kubenswrapper[4790]: I1003 10:27:14.035307 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d1558347-4d48-4e51-a89c-a3d8664016a4/glance-log/0.log" Oct 03 10:27:14 crc kubenswrapper[4790]: I1003 10:27:14.058718 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d1558347-4d48-4e51-a89c-a3d8664016a4/glance-httpd/0.log" Oct 03 10:27:14 crc kubenswrapper[4790]: I1003 10:27:14.283377 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-86f486799b-gwc4l_6b9b4763-1706-4c7e-b39f-0d8311a9637d/horizon/0.log" Oct 03 10:27:14 crc kubenswrapper[4790]: I1003 
10:27:14.417565 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-x28c6_037bc62e-a9b6-45c4-9bce-9395a372bc90/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:14 crc kubenswrapper[4790]: I1003 10:27:14.566372 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-nb4kf_eae0b967-81c1-45eb-8240-26198c43a621/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:14 crc kubenswrapper[4790]: I1003 10:27:14.989542 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29324701-xvbcm_0eab3760-3551-42a8-9eaa-f53d4b94a8ca/keystone-cron/0.log" Oct 03 10:27:15 crc kubenswrapper[4790]: I1003 10:27:15.157268 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-86f486799b-gwc4l_6b9b4763-1706-4c7e-b39f-0d8311a9637d/horizon-log/0.log" Oct 03 10:27:15 crc kubenswrapper[4790]: I1003 10:27:15.253781 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29324761-s6cn9_95907765-166e-4e38-a906-86b4020f3b5c/keystone-cron/0.log" Oct 03 10:27:15 crc kubenswrapper[4790]: I1003 10:27:15.328513 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:27:15 crc kubenswrapper[4790]: E1003 10:27:15.328753 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:27:15 crc kubenswrapper[4790]: I1003 10:27:15.438858 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_272a9be6-ce74-4ee6-a1d2-17a8935d8c2e/kube-state-metrics/0.log" Oct 03 10:27:15 crc kubenswrapper[4790]: I1003 10:27:15.508077 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7c4ff96776-8dljp_750b0c2b-c871-46f7-b318-78ace47356a9/keystone-api/0.log" Oct 03 10:27:15 crc kubenswrapper[4790]: I1003 10:27:15.670498 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-pgz87_60e50039-1bd7-4bd0-98ae-f1285c701e2e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:16 crc kubenswrapper[4790]: I1003 10:27:16.223189 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-95d849fd9-jww5w_11739e3d-ef66-4152-bd55-6c3de0dd19e2/neutron-httpd/0.log" Oct 03 10:27:16 crc kubenswrapper[4790]: I1003 10:27:16.299147 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-95d849fd9-jww5w_11739e3d-ef66-4152-bd55-6c3de0dd19e2/neutron-api/0.log" Oct 03 10:27:16 crc kubenswrapper[4790]: I1003 10:27:16.314293 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-64ldb_1e83d303-3e2f-4dab-b67f-de344d5fe175/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:17 crc kubenswrapper[4790]: I1003 10:27:17.370163 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_46dabafe-d4c6-4b3e-84f6-3eb83eee9647/nova-cell0-conductor-conductor/0.log" Oct 03 10:27:18 crc kubenswrapper[4790]: I1003 10:27:18.052082 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a529d30e-fdca-4eb5-8233-34e63534a164/nova-cell1-conductor-conductor/0.log" Oct 03 10:27:18 crc kubenswrapper[4790]: I1003 10:27:18.387531 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_f3e4ca6a-6d02-4c7e-b549-64e5900371ab/nova-api-log/0.log" Oct 03 10:27:18 crc kubenswrapper[4790]: I1003 10:27:18.697833 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_bbe5d109-dfdc-4f03-afa3-a71800a7d74b/nova-cell1-novncproxy-novncproxy/0.log" Oct 03 10:27:18 crc kubenswrapper[4790]: I1003 10:27:18.856709 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-z7tfh_7e86fc9f-d34e-41ca-b945-b8b45136512f/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:18 crc kubenswrapper[4790]: I1003 10:27:18.901913 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f3e4ca6a-6d02-4c7e-b549-64e5900371ab/nova-api-api/0.log" Oct 03 10:27:19 crc kubenswrapper[4790]: I1003 10:27:19.242265 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b829e543-5d7d-4e9b-9674-9c24d060a0c5/nova-metadata-log/0.log" Oct 03 10:27:19 crc kubenswrapper[4790]: I1003 10:27:19.714793 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_89aeb08e-2699-4c1b-99ef-4ae2cfce9de3/mysql-bootstrap/0.log" Oct 03 10:27:19 crc kubenswrapper[4790]: I1003 10:27:19.901081 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_87e3a414-fd4b-49ac-8e64-3514f9725df1/nova-scheduler-scheduler/0.log" Oct 03 10:27:19 crc kubenswrapper[4790]: I1003 10:27:19.906917 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_89aeb08e-2699-4c1b-99ef-4ae2cfce9de3/mysql-bootstrap/0.log" Oct 03 10:27:20 crc kubenswrapper[4790]: I1003 10:27:20.136129 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_89aeb08e-2699-4c1b-99ef-4ae2cfce9de3/galera/0.log" Oct 03 10:27:20 crc kubenswrapper[4790]: I1003 10:27:20.331646 4790 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-galera-0_010b668f-22e4-4ad8-aae5-decf063cd49b/mysql-bootstrap/0.log" Oct 03 10:27:20 crc kubenswrapper[4790]: I1003 10:27:20.520411 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_010b668f-22e4-4ad8-aae5-decf063cd49b/mysql-bootstrap/0.log" Oct 03 10:27:20 crc kubenswrapper[4790]: I1003 10:27:20.593173 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_010b668f-22e4-4ad8-aae5-decf063cd49b/galera/0.log" Oct 03 10:27:20 crc kubenswrapper[4790]: I1003 10:27:20.813983 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7a835cde-9c23-4cf9-9473-35d01cf2c528/openstackclient/0.log" Oct 03 10:27:21 crc kubenswrapper[4790]: I1003 10:27:21.023713 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-8ljvm_845dcf0a-43fb-4760-aac6-d12c02a8b785/ovn-controller/0.log" Oct 03 10:27:21 crc kubenswrapper[4790]: I1003 10:27:21.293763 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6vcdh_f892381b-8fc9-4352-915e-62d0b92cb7b6/openstack-network-exporter/0.log" Oct 03 10:27:21 crc kubenswrapper[4790]: I1003 10:27:21.531416 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vrsc7_5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec/ovsdb-server-init/0.log" Oct 03 10:27:21 crc kubenswrapper[4790]: I1003 10:27:21.699590 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vrsc7_5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec/ovsdb-server-init/0.log" Oct 03 10:27:21 crc kubenswrapper[4790]: I1003 10:27:21.749955 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b829e543-5d7d-4e9b-9674-9c24d060a0c5/nova-metadata-metadata/0.log" Oct 03 10:27:21 crc kubenswrapper[4790]: I1003 10:27:21.932174 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-vrsc7_5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec/ovsdb-server/0.log" Oct 03 10:27:22 crc kubenswrapper[4790]: I1003 10:27:22.104703 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vrsc7_5d8a4a4d-fc2d-49d8-80fd-8b6faf7e3aec/ovs-vswitchd/0.log" Oct 03 10:27:22 crc kubenswrapper[4790]: I1003 10:27:22.126651 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-c77vh_f3195619-a867-4741-a9c5-fc995f682859/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:22 crc kubenswrapper[4790]: I1003 10:27:22.318805 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b/openstack-network-exporter/0.log" Oct 03 10:27:22 crc kubenswrapper[4790]: I1003 10:27:22.335523 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b0f1b68c-e2ab-4d14-b207-a8a336b1bc1b/ovn-northd/0.log" Oct 03 10:27:22 crc kubenswrapper[4790]: I1003 10:27:22.501208 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6330b59a-ef23-4482-9fa5-58ccb2f6bc5a/openstack-network-exporter/0.log" Oct 03 10:27:22 crc kubenswrapper[4790]: I1003 10:27:22.606008 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6330b59a-ef23-4482-9fa5-58ccb2f6bc5a/ovsdbserver-nb/0.log" Oct 03 10:27:22 crc kubenswrapper[4790]: I1003 10:27:22.860326 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c53865bf-810d-440a-bfa4-171e3183066c/openstack-network-exporter/0.log" Oct 03 10:27:22 crc kubenswrapper[4790]: I1003 10:27:22.863891 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_c53865bf-810d-440a-bfa4-171e3183066c/ovsdbserver-sb/0.log" Oct 03 10:27:23 crc kubenswrapper[4790]: I1003 10:27:23.241052 4790 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_placement-6d5469d486-dtt4h_38463967-5519-4ea0-8081-b22ac7858270/placement-api/0.log" Oct 03 10:27:23 crc kubenswrapper[4790]: I1003 10:27:23.433661 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6d5469d486-dtt4h_38463967-5519-4ea0-8081-b22ac7858270/placement-log/0.log" Oct 03 10:27:23 crc kubenswrapper[4790]: I1003 10:27:23.484376 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ba88a153-0441-4fd3-a3b3-37338c589169/init-config-reloader/0.log" Oct 03 10:27:23 crc kubenswrapper[4790]: I1003 10:27:23.671213 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ba88a153-0441-4fd3-a3b3-37338c589169/prometheus/0.log" Oct 03 10:27:23 crc kubenswrapper[4790]: I1003 10:27:23.686502 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ba88a153-0441-4fd3-a3b3-37338c589169/config-reloader/0.log" Oct 03 10:27:23 crc kubenswrapper[4790]: I1003 10:27:23.688870 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ba88a153-0441-4fd3-a3b3-37338c589169/init-config-reloader/0.log" Oct 03 10:27:23 crc kubenswrapper[4790]: I1003 10:27:23.957966 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ba88a153-0441-4fd3-a3b3-37338c589169/thanos-sidecar/0.log" Oct 03 10:27:23 crc kubenswrapper[4790]: I1003 10:27:23.968000 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9/setup-container/0.log" Oct 03 10:27:24 crc kubenswrapper[4790]: I1003 10:27:24.171751 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9/rabbitmq/0.log" Oct 03 10:27:24 crc kubenswrapper[4790]: I1003 10:27:24.222838 4790 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_77c8e0d3-b23b-44de-a8d3-0eb2e368e9e9/setup-container/0.log" Oct 03 10:27:24 crc kubenswrapper[4790]: I1003 10:27:24.379836 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_04926fe2-c5e2-4663-8945-448a17ec2a8b/setup-container/0.log" Oct 03 10:27:24 crc kubenswrapper[4790]: I1003 10:27:24.643899 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_04926fe2-c5e2-4663-8945-448a17ec2a8b/rabbitmq/0.log" Oct 03 10:27:24 crc kubenswrapper[4790]: I1003 10:27:24.684214 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_04926fe2-c5e2-4663-8945-448a17ec2a8b/setup-container/0.log" Oct 03 10:27:24 crc kubenswrapper[4790]: I1003 10:27:24.821881 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_babbce79-b0e5-48e6-a5b7-b08e51f1dd69/setup-container/0.log" Oct 03 10:27:25 crc kubenswrapper[4790]: I1003 10:27:25.047139 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_babbce79-b0e5-48e6-a5b7-b08e51f1dd69/setup-container/0.log" Oct 03 10:27:25 crc kubenswrapper[4790]: I1003 10:27:25.069996 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_babbce79-b0e5-48e6-a5b7-b08e51f1dd69/rabbitmq/0.log" Oct 03 10:27:25 crc kubenswrapper[4790]: I1003 10:27:25.258843 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-nnnn2_8cc4fd03-2e9a-4988-8e58-06d9df765283/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:25 crc kubenswrapper[4790]: I1003 10:27:25.334636 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-c52nn_06011acd-ae43-466d-b25d-70eddbc5c34f/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:25 crc kubenswrapper[4790]: I1003 10:27:25.508473 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cfdk9_551d5e7d-f7da-45c1-acbe-efa124afdd28/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:25 crc kubenswrapper[4790]: I1003 10:27:25.918807 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-zm2qd_8dd56d83-b17f-4633-822c-e9ce0e3236d3/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:25 crc kubenswrapper[4790]: I1003 10:27:25.997882 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-brzw8_44cc8b3a-08a0-490c-b1ba-e23b7da761a4/ssh-known-hosts-edpm-deployment/0.log" Oct 03 10:27:26 crc kubenswrapper[4790]: I1003 10:27:26.368448 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7657d6d659-5lb95_9ee67672-dab9-4d89-b6b4-02eb06e8fe8d/proxy-server/0.log" Oct 03 10:27:26 crc kubenswrapper[4790]: I1003 10:27:26.517104 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7657d6d659-5lb95_9ee67672-dab9-4d89-b6b4-02eb06e8fe8d/proxy-httpd/0.log" Oct 03 10:27:26 crc kubenswrapper[4790]: I1003 10:27:26.570251 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-5tpvp_4f9b13ce-6ec8-4e05-b395-c7ec580823ed/swift-ring-rebalance/0.log" Oct 03 10:27:26 crc kubenswrapper[4790]: I1003 10:27:26.717309 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/account-auditor/0.log" Oct 03 10:27:26 crc kubenswrapper[4790]: I1003 10:27:26.821490 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/account-reaper/0.log" Oct 03 10:27:26 crc kubenswrapper[4790]: I1003 10:27:26.931733 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/account-server/0.log" Oct 03 10:27:26 crc kubenswrapper[4790]: I1003 10:27:26.957139 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/account-replicator/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.027806 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/container-auditor/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.178194 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/container-replicator/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.229404 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/container-server/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.288038 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/container-updater/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.328055 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:27:27 crc kubenswrapper[4790]: E1003 10:27:27.328371 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.456374 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/object-auditor/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.462959 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/object-expirer/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.620732 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/object-replicator/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.693517 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/object-server/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.710287 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/object-updater/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.886035 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/rsync/0.log" Oct 03 10:27:27 crc kubenswrapper[4790]: I1003 10:27:27.901350 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6c105dc-48df-4d07-8a82-ca93e78ad98c/swift-recon-cron/0.log" Oct 03 10:27:28 crc kubenswrapper[4790]: I1003 10:27:28.165130 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-48d4w_780db1d7-b699-4f38-9f86-c6bcf5c7c9a6/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:28 crc kubenswrapper[4790]: I1003 10:27:28.197304 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_69ff309d-d784-487d-a190-4529008cb497/tempest-tests-tempest-tests-runner/0.log" Oct 03 10:27:28 crc kubenswrapper[4790]: I1003 10:27:28.380020 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_516af441-b673-4c5b-840f-1f9bda8cdc5a/test-operator-logs-container/0.log" Oct 03 10:27:28 crc kubenswrapper[4790]: I1003 10:27:28.550413 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-b75fr_a18ef35c-666c-4fc3-b466-7d63bb44e942/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 03 10:27:29 crc kubenswrapper[4790]: I1003 10:27:29.700559 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_67485d96-5ca0-4195-bb7a-d09d03d3dc49/watcher-applier/0.log" Oct 03 10:27:29 crc kubenswrapper[4790]: I1003 10:27:29.926303 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_e0b0e105-a5dc-4a34-b966-14fba0fd89ec/watcher-api-log/0.log" Oct 03 10:27:31 crc kubenswrapper[4790]: I1003 10:27:31.456336 4790 scope.go:117] "RemoveContainer" containerID="5b6306d88588c51309cece09d73532919ee27329505bcc43150533907c51556f" Oct 03 10:27:34 crc kubenswrapper[4790]: I1003 10:27:34.349773 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_5d8afb4c-874f-4d5d-ad9b-e11a24bd9799/watcher-decision-engine/0.log" Oct 03 10:27:35 crc kubenswrapper[4790]: I1003 10:27:35.007457 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_e0b0e105-a5dc-4a34-b966-14fba0fd89ec/watcher-api/0.log" Oct 03 10:27:41 crc kubenswrapper[4790]: I1003 10:27:41.329040 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:27:41 crc kubenswrapper[4790]: E1003 10:27:41.329730 4790 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:27:41 crc kubenswrapper[4790]: I1003 10:27:41.763222 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3b885fd9-8ed3-45b9-8e6a-63069940ae79/memcached/0.log" Oct 03 10:27:55 crc kubenswrapper[4790]: I1003 10:27:55.328803 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:27:55 crc kubenswrapper[4790]: E1003 10:27:55.329627 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:28:01 crc kubenswrapper[4790]: I1003 10:28:01.065932 4790 generic.go:334] "Generic (PLEG): container finished" podID="a1d20993-d138-4931-9b85-9d201850df77" containerID="9af713dbabb8b5349bf51d66cb1df5d7bf9315c1bedd468fc4acaaefab00611a" exitCode=0 Oct 03 10:28:01 crc kubenswrapper[4790]: I1003 10:28:01.066044 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" event={"ID":"a1d20993-d138-4931-9b85-9d201850df77","Type":"ContainerDied","Data":"9af713dbabb8b5349bf51d66cb1df5d7bf9315c1bedd468fc4acaaefab00611a"} Oct 03 10:28:02 crc kubenswrapper[4790]: I1003 10:28:02.220104 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" Oct 03 10:28:02 crc kubenswrapper[4790]: I1003 10:28:02.261014 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dxvsr/crc-debug-ktqmr"] Oct 03 10:28:02 crc kubenswrapper[4790]: I1003 10:28:02.270618 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dxvsr/crc-debug-ktqmr"] Oct 03 10:28:02 crc kubenswrapper[4790]: I1003 10:28:02.294949 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5wtn\" (UniqueName: \"kubernetes.io/projected/a1d20993-d138-4931-9b85-9d201850df77-kube-api-access-n5wtn\") pod \"a1d20993-d138-4931-9b85-9d201850df77\" (UID: \"a1d20993-d138-4931-9b85-9d201850df77\") " Oct 03 10:28:02 crc kubenswrapper[4790]: I1003 10:28:02.295144 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1d20993-d138-4931-9b85-9d201850df77-host\") pod \"a1d20993-d138-4931-9b85-9d201850df77\" (UID: \"a1d20993-d138-4931-9b85-9d201850df77\") " Oct 03 10:28:02 crc kubenswrapper[4790]: I1003 10:28:02.295226 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1d20993-d138-4931-9b85-9d201850df77-host" (OuterVolumeSpecName: "host") pod "a1d20993-d138-4931-9b85-9d201850df77" (UID: "a1d20993-d138-4931-9b85-9d201850df77"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:28:02 crc kubenswrapper[4790]: I1003 10:28:02.295611 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1d20993-d138-4931-9b85-9d201850df77-host\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:02 crc kubenswrapper[4790]: I1003 10:28:02.303258 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1d20993-d138-4931-9b85-9d201850df77-kube-api-access-n5wtn" (OuterVolumeSpecName: "kube-api-access-n5wtn") pod "a1d20993-d138-4931-9b85-9d201850df77" (UID: "a1d20993-d138-4931-9b85-9d201850df77"). InnerVolumeSpecName "kube-api-access-n5wtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:28:02 crc kubenswrapper[4790]: I1003 10:28:02.341051 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1d20993-d138-4931-9b85-9d201850df77" path="/var/lib/kubelet/pods/a1d20993-d138-4931-9b85-9d201850df77/volumes" Oct 03 10:28:02 crc kubenswrapper[4790]: I1003 10:28:02.397442 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5wtn\" (UniqueName: \"kubernetes.io/projected/a1d20993-d138-4931-9b85-9d201850df77-kube-api-access-n5wtn\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.094192 4790 scope.go:117] "RemoveContainer" containerID="9af713dbabb8b5349bf51d66cb1df5d7bf9315c1bedd468fc4acaaefab00611a" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.094243 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-ktqmr" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.443336 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dxvsr/crc-debug-9lt64"] Oct 03 10:28:03 crc kubenswrapper[4790]: E1003 10:28:03.443775 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1d20993-d138-4931-9b85-9d201850df77" containerName="container-00" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.443787 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1d20993-d138-4931-9b85-9d201850df77" containerName="container-00" Oct 03 10:28:03 crc kubenswrapper[4790]: E1003 10:28:03.443833 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerName="extract-content" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.443839 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerName="extract-content" Oct 03 10:28:03 crc kubenswrapper[4790]: E1003 10:28:03.443852 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerName="extract-utilities" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.443860 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerName="extract-utilities" Oct 03 10:28:03 crc kubenswrapper[4790]: E1003 10:28:03.443868 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerName="registry-server" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.443874 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerName="registry-server" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.444047 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1d20993-d138-4931-9b85-9d201850df77" 
containerName="container-00" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.444069 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f0029d4-804b-4699-89ba-a01fd8d91fd7" containerName="registry-server" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.444773 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-9lt64" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.522637 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-host\") pod \"crc-debug-9lt64\" (UID: \"8f13c513-1c93-4efa-aaeb-e8b0777d3a18\") " pod="openshift-must-gather-dxvsr/crc-debug-9lt64" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.522777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bh7w\" (UniqueName: \"kubernetes.io/projected/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-kube-api-access-6bh7w\") pod \"crc-debug-9lt64\" (UID: \"8f13c513-1c93-4efa-aaeb-e8b0777d3a18\") " pod="openshift-must-gather-dxvsr/crc-debug-9lt64" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.624598 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bh7w\" (UniqueName: \"kubernetes.io/projected/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-kube-api-access-6bh7w\") pod \"crc-debug-9lt64\" (UID: \"8f13c513-1c93-4efa-aaeb-e8b0777d3a18\") " pod="openshift-must-gather-dxvsr/crc-debug-9lt64" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.624813 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-host\") pod \"crc-debug-9lt64\" (UID: \"8f13c513-1c93-4efa-aaeb-e8b0777d3a18\") " pod="openshift-must-gather-dxvsr/crc-debug-9lt64" Oct 03 10:28:03 crc 
kubenswrapper[4790]: I1003 10:28:03.624989 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-host\") pod \"crc-debug-9lt64\" (UID: \"8f13c513-1c93-4efa-aaeb-e8b0777d3a18\") " pod="openshift-must-gather-dxvsr/crc-debug-9lt64" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.648713 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bh7w\" (UniqueName: \"kubernetes.io/projected/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-kube-api-access-6bh7w\") pod \"crc-debug-9lt64\" (UID: \"8f13c513-1c93-4efa-aaeb-e8b0777d3a18\") " pod="openshift-must-gather-dxvsr/crc-debug-9lt64" Oct 03 10:28:03 crc kubenswrapper[4790]: I1003 10:28:03.762742 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-9lt64" Oct 03 10:28:04 crc kubenswrapper[4790]: I1003 10:28:04.110641 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/crc-debug-9lt64" event={"ID":"8f13c513-1c93-4efa-aaeb-e8b0777d3a18","Type":"ContainerStarted","Data":"759a5431cb32737bdfdfbae4ef85c79871fed0bd2d2d6e4f2672390b4a13ac59"} Oct 03 10:28:04 crc kubenswrapper[4790]: I1003 10:28:04.111057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/crc-debug-9lt64" event={"ID":"8f13c513-1c93-4efa-aaeb-e8b0777d3a18","Type":"ContainerStarted","Data":"06f05ab9cfb08b9ef35ecf61478c5b6b6b412ef026cdd6527acb158595f09ddf"} Oct 03 10:28:04 crc kubenswrapper[4790]: I1003 10:28:04.132126 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dxvsr/crc-debug-9lt64" podStartSLOduration=1.132100612 podStartE2EDuration="1.132100612s" podCreationTimestamp="2025-10-03 10:28:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-03 
10:28:04.125912191 +0000 UTC m=+6470.478846449" watchObservedRunningTime="2025-10-03 10:28:04.132100612 +0000 UTC m=+6470.485034880" Oct 03 10:28:05 crc kubenswrapper[4790]: I1003 10:28:05.123235 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f13c513-1c93-4efa-aaeb-e8b0777d3a18" containerID="759a5431cb32737bdfdfbae4ef85c79871fed0bd2d2d6e4f2672390b4a13ac59" exitCode=0 Oct 03 10:28:05 crc kubenswrapper[4790]: I1003 10:28:05.124668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/crc-debug-9lt64" event={"ID":"8f13c513-1c93-4efa-aaeb-e8b0777d3a18","Type":"ContainerDied","Data":"759a5431cb32737bdfdfbae4ef85c79871fed0bd2d2d6e4f2672390b4a13ac59"} Oct 03 10:28:06 crc kubenswrapper[4790]: I1003 10:28:06.276544 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-9lt64" Oct 03 10:28:06 crc kubenswrapper[4790]: I1003 10:28:06.381410 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-host\") pod \"8f13c513-1c93-4efa-aaeb-e8b0777d3a18\" (UID: \"8f13c513-1c93-4efa-aaeb-e8b0777d3a18\") " Oct 03 10:28:06 crc kubenswrapper[4790]: I1003 10:28:06.381470 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-host" (OuterVolumeSpecName: "host") pod "8f13c513-1c93-4efa-aaeb-e8b0777d3a18" (UID: "8f13c513-1c93-4efa-aaeb-e8b0777d3a18"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:28:06 crc kubenswrapper[4790]: I1003 10:28:06.381597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bh7w\" (UniqueName: \"kubernetes.io/projected/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-kube-api-access-6bh7w\") pod \"8f13c513-1c93-4efa-aaeb-e8b0777d3a18\" (UID: \"8f13c513-1c93-4efa-aaeb-e8b0777d3a18\") " Oct 03 10:28:06 crc kubenswrapper[4790]: I1003 10:28:06.382239 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-host\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:06 crc kubenswrapper[4790]: I1003 10:28:06.390440 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-kube-api-access-6bh7w" (OuterVolumeSpecName: "kube-api-access-6bh7w") pod "8f13c513-1c93-4efa-aaeb-e8b0777d3a18" (UID: "8f13c513-1c93-4efa-aaeb-e8b0777d3a18"). InnerVolumeSpecName "kube-api-access-6bh7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:28:06 crc kubenswrapper[4790]: I1003 10:28:06.483624 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bh7w\" (UniqueName: \"kubernetes.io/projected/8f13c513-1c93-4efa-aaeb-e8b0777d3a18-kube-api-access-6bh7w\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:07 crc kubenswrapper[4790]: I1003 10:28:07.141676 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/crc-debug-9lt64" event={"ID":"8f13c513-1c93-4efa-aaeb-e8b0777d3a18","Type":"ContainerDied","Data":"06f05ab9cfb08b9ef35ecf61478c5b6b6b412ef026cdd6527acb158595f09ddf"} Oct 03 10:28:07 crc kubenswrapper[4790]: I1003 10:28:07.141722 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06f05ab9cfb08b9ef35ecf61478c5b6b6b412ef026cdd6527acb158595f09ddf" Oct 03 10:28:07 crc kubenswrapper[4790]: I1003 10:28:07.141703 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-9lt64" Oct 03 10:28:07 crc kubenswrapper[4790]: I1003 10:28:07.328553 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:28:07 crc kubenswrapper[4790]: E1003 10:28:07.329051 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:28:13 crc kubenswrapper[4790]: I1003 10:28:13.771480 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dxvsr/crc-debug-9lt64"] Oct 03 10:28:13 crc kubenswrapper[4790]: I1003 10:28:13.780794 4790 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dxvsr/crc-debug-9lt64"] Oct 03 10:28:14 crc kubenswrapper[4790]: I1003 10:28:14.347567 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f13c513-1c93-4efa-aaeb-e8b0777d3a18" path="/var/lib/kubelet/pods/8f13c513-1c93-4efa-aaeb-e8b0777d3a18/volumes" Oct 03 10:28:14 crc kubenswrapper[4790]: I1003 10:28:14.950758 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dxvsr/crc-debug-9rqdx"] Oct 03 10:28:14 crc kubenswrapper[4790]: E1003 10:28:14.951533 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f13c513-1c93-4efa-aaeb-e8b0777d3a18" containerName="container-00" Oct 03 10:28:14 crc kubenswrapper[4790]: I1003 10:28:14.951548 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f13c513-1c93-4efa-aaeb-e8b0777d3a18" containerName="container-00" Oct 03 10:28:14 crc kubenswrapper[4790]: I1003 10:28:14.951753 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f13c513-1c93-4efa-aaeb-e8b0777d3a18" containerName="container-00" Oct 03 10:28:14 crc kubenswrapper[4790]: I1003 10:28:14.952704 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" Oct 03 10:28:15 crc kubenswrapper[4790]: I1003 10:28:15.070921 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-host\") pod \"crc-debug-9rqdx\" (UID: \"805f0a83-e530-4cb6-9f1b-9f6de2acb4de\") " pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" Oct 03 10:28:15 crc kubenswrapper[4790]: I1003 10:28:15.070975 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfgpw\" (UniqueName: \"kubernetes.io/projected/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-kube-api-access-tfgpw\") pod \"crc-debug-9rqdx\" (UID: \"805f0a83-e530-4cb6-9f1b-9f6de2acb4de\") " pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" Oct 03 10:28:15 crc kubenswrapper[4790]: I1003 10:28:15.173651 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-host\") pod \"crc-debug-9rqdx\" (UID: \"805f0a83-e530-4cb6-9f1b-9f6de2acb4de\") " pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" Oct 03 10:28:15 crc kubenswrapper[4790]: I1003 10:28:15.173702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfgpw\" (UniqueName: \"kubernetes.io/projected/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-kube-api-access-tfgpw\") pod \"crc-debug-9rqdx\" (UID: \"805f0a83-e530-4cb6-9f1b-9f6de2acb4de\") " pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" Oct 03 10:28:15 crc kubenswrapper[4790]: I1003 10:28:15.173848 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-host\") pod \"crc-debug-9rqdx\" (UID: \"805f0a83-e530-4cb6-9f1b-9f6de2acb4de\") " pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" Oct 03 10:28:15 crc 
kubenswrapper[4790]: I1003 10:28:15.199445 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfgpw\" (UniqueName: \"kubernetes.io/projected/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-kube-api-access-tfgpw\") pod \"crc-debug-9rqdx\" (UID: \"805f0a83-e530-4cb6-9f1b-9f6de2acb4de\") " pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" Oct 03 10:28:15 crc kubenswrapper[4790]: I1003 10:28:15.272777 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" Oct 03 10:28:16 crc kubenswrapper[4790]: I1003 10:28:16.221116 4790 generic.go:334] "Generic (PLEG): container finished" podID="805f0a83-e530-4cb6-9f1b-9f6de2acb4de" containerID="d4fc3b80c4d9d744a1c7ed1f81a4449b69e32b0f9381f096715d2e96d1793f1e" exitCode=0 Oct 03 10:28:16 crc kubenswrapper[4790]: I1003 10:28:16.221185 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" event={"ID":"805f0a83-e530-4cb6-9f1b-9f6de2acb4de","Type":"ContainerDied","Data":"d4fc3b80c4d9d744a1c7ed1f81a4449b69e32b0f9381f096715d2e96d1793f1e"} Oct 03 10:28:16 crc kubenswrapper[4790]: I1003 10:28:16.221539 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" event={"ID":"805f0a83-e530-4cb6-9f1b-9f6de2acb4de","Type":"ContainerStarted","Data":"b846fcecb83f3afeb41defff435f6a13ec023346618667bf320b27c95e40acde"} Oct 03 10:28:16 crc kubenswrapper[4790]: I1003 10:28:16.259901 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dxvsr/crc-debug-9rqdx"] Oct 03 10:28:16 crc kubenswrapper[4790]: I1003 10:28:16.271802 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dxvsr/crc-debug-9rqdx"] Oct 03 10:28:17 crc kubenswrapper[4790]: I1003 10:28:17.381277 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" Oct 03 10:28:17 crc kubenswrapper[4790]: I1003 10:28:17.521379 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfgpw\" (UniqueName: \"kubernetes.io/projected/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-kube-api-access-tfgpw\") pod \"805f0a83-e530-4cb6-9f1b-9f6de2acb4de\" (UID: \"805f0a83-e530-4cb6-9f1b-9f6de2acb4de\") " Oct 03 10:28:17 crc kubenswrapper[4790]: I1003 10:28:17.521443 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-host\") pod \"805f0a83-e530-4cb6-9f1b-9f6de2acb4de\" (UID: \"805f0a83-e530-4cb6-9f1b-9f6de2acb4de\") " Oct 03 10:28:17 crc kubenswrapper[4790]: I1003 10:28:17.521990 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-host" (OuterVolumeSpecName: "host") pod "805f0a83-e530-4cb6-9f1b-9f6de2acb4de" (UID: "805f0a83-e530-4cb6-9f1b-9f6de2acb4de"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 03 10:28:17 crc kubenswrapper[4790]: I1003 10:28:17.528272 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-kube-api-access-tfgpw" (OuterVolumeSpecName: "kube-api-access-tfgpw") pod "805f0a83-e530-4cb6-9f1b-9f6de2acb4de" (UID: "805f0a83-e530-4cb6-9f1b-9f6de2acb4de"). InnerVolumeSpecName "kube-api-access-tfgpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:28:17 crc kubenswrapper[4790]: I1003 10:28:17.624713 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-host\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:17 crc kubenswrapper[4790]: I1003 10:28:17.625050 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfgpw\" (UniqueName: \"kubernetes.io/projected/805f0a83-e530-4cb6-9f1b-9f6de2acb4de-kube-api-access-tfgpw\") on node \"crc\" DevicePath \"\"" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.184547 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/util/0.log" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.266711 4790 scope.go:117] "RemoveContainer" containerID="d4fc3b80c4d9d744a1c7ed1f81a4449b69e32b0f9381f096715d2e96d1793f1e" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.266906 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxvsr/crc-debug-9rqdx" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.348896 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="805f0a83-e530-4cb6-9f1b-9f6de2acb4de" path="/var/lib/kubelet/pods/805f0a83-e530-4cb6-9f1b-9f6de2acb4de/volumes" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.463920 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/util/0.log" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.509865 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/pull/0.log" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.528506 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/pull/0.log" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.701153 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/pull/0.log" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.702470 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/extract/0.log" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.716230 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_05e449ad10d2925ed0c6277e570b403daaa897f83ce6b356969c83b302dmhsz_a3112067-ef2a-471a-94af-601f199e667d/util/0.log" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.880124 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-rtftw_f598237f-5311-44e2-8ed2-3de81bbebc91/kube-rbac-proxy/0.log" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.953036 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6c675fb79f-rtftw_f598237f-5311-44e2-8ed2-3de81bbebc91/manager/0.log" Oct 03 10:28:18 crc kubenswrapper[4790]: I1003 10:28:18.959508 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-q7g8m_9b9ab2ca-8c9e-46d1-af78-b811e32b05f0/kube-rbac-proxy/0.log" Oct 03 10:28:19 crc kubenswrapper[4790]: I1003 10:28:19.113081 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-79d68d6c85-q7g8m_9b9ab2ca-8c9e-46d1-af78-b811e32b05f0/manager/0.log" Oct 03 10:28:19 crc kubenswrapper[4790]: I1003 10:28:19.156989 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-s25k5_b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf/kube-rbac-proxy/0.log" Oct 03 10:28:19 crc kubenswrapper[4790]: I1003 10:28:19.187672 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-s25k5_b1f2aef2-7fd0-4d10-ab5d-f0a8b2e7ffaf/manager/0.log" Oct 03 10:28:19 crc kubenswrapper[4790]: I1003 10:28:19.387676 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-pqh6z_a168c4e9-522f-4f9b-97d3-5c50008ec59c/kube-rbac-proxy/0.log" Oct 03 10:28:19 crc kubenswrapper[4790]: I1003 10:28:19.511213 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-846dff85b5-pqh6z_a168c4e9-522f-4f9b-97d3-5c50008ec59c/manager/0.log" Oct 03 10:28:19 crc kubenswrapper[4790]: I1003 
10:28:19.546567 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-h2km6_d6653934-c835-4642-a1da-9159b7f85dcb/kube-rbac-proxy/0.log" Oct 03 10:28:19 crc kubenswrapper[4790]: I1003 10:28:19.650484 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-599898f689-h2km6_d6653934-c835-4642-a1da-9159b7f85dcb/manager/0.log" Oct 03 10:28:19 crc kubenswrapper[4790]: I1003 10:28:19.739620 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-d6t4r_5b3d2a0d-d36d-4a67-9c5f-6a85735127a1/kube-rbac-proxy/0.log" Oct 03 10:28:19 crc kubenswrapper[4790]: I1003 10:28:19.754681 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6769b867d9-d6t4r_5b3d2a0d-d36d-4a67-9c5f-6a85735127a1/manager/0.log" Oct 03 10:28:19 crc kubenswrapper[4790]: I1003 10:28:19.857366 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-894gw_2bc6d6a2-91a5-464c-bba7-3a00ee0b1938/kube-rbac-proxy/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.107147 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-2vq5g_1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4/kube-rbac-proxy/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.111338 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5fbf469cd7-894gw_2bc6d6a2-91a5-464c-bba7-3a00ee0b1938/manager/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.143383 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-84bc9db6cc-2vq5g_1eec0ecd-6bcb-4df5-b6bc-66780ca3e4d4/manager/0.log" Oct 03 
10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.306106 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-twvmq_12237b3a-fae2-4cac-b0d7-ee396be3da08/kube-rbac-proxy/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.328728 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:28:20 crc kubenswrapper[4790]: E1003 10:28:20.329023 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.379902 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7f55849f88-twvmq_12237b3a-fae2-4cac-b0d7-ee396be3da08/manager/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.511967 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-k8l6q_9cfa9536-48e0-4639-a0ed-acab9ab92ed0/kube-rbac-proxy/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.542518 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6fd6854b49-k8l6q_9cfa9536-48e0-4639-a0ed-acab9ab92ed0/manager/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.590141 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-sdqf6_c97f26b2-5313-49db-8ee8-bcde2e793bfa/kube-rbac-proxy/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 
10:28:20.724413 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5c468bf4d4-sdqf6_c97f26b2-5313-49db-8ee8-bcde2e793bfa/manager/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.781101 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-668k4_e6f6790b-7aca-49c0-881f-8a7d15ac4071/kube-rbac-proxy/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.867857 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6574bf987d-668k4_e6f6790b-7aca-49c0-881f-8a7d15ac4071/manager/0.log" Oct 03 10:28:20 crc kubenswrapper[4790]: I1003 10:28:20.970478 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-tlmwh_b52c73ba-c737-4eaf-a913-155d05c96f6e/kube-rbac-proxy/0.log" Oct 03 10:28:21 crc kubenswrapper[4790]: I1003 10:28:21.023266 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-555c7456bd-tlmwh_b52c73ba-c737-4eaf-a913-155d05c96f6e/manager/0.log" Oct 03 10:28:21 crc kubenswrapper[4790]: I1003 10:28:21.185064 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-62zvh_98c1e024-12c4-47f7-ac46-ae9f6bd45ab9/manager/0.log" Oct 03 10:28:21 crc kubenswrapper[4790]: I1003 10:28:21.205413 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-59d6cfdf45-62zvh_98c1e024-12c4-47f7-ac46-ae9f6bd45ab9/kube-rbac-proxy/0.log" Oct 03 10:28:21 crc kubenswrapper[4790]: I1003 10:28:21.341004 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5_759c9730-8d00-4516-9ed2-79f56be527bf/kube-rbac-proxy/0.log" Oct 03 10:28:21 crc kubenswrapper[4790]: I1003 10:28:21.398642 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64c4d678lpbm5_759c9730-8d00-4516-9ed2-79f56be527bf/manager/0.log" Oct 03 10:28:21 crc kubenswrapper[4790]: I1003 10:28:21.589462 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fb94b767d-nhfws_c64c5a59-ac27-4428-8cdc-a1f8c96368bb/kube-rbac-proxy/0.log" Oct 03 10:28:21 crc kubenswrapper[4790]: I1003 10:28:21.686314 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7f8d586cd6-ptxlm_f1035867-3856-4436-8175-e21df1fbb18c/kube-rbac-proxy/0.log" Oct 03 10:28:21 crc kubenswrapper[4790]: I1003 10:28:21.964284 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7f8d586cd6-ptxlm_f1035867-3856-4436-8175-e21df1fbb18c/operator/0.log" Oct 03 10:28:22 crc kubenswrapper[4790]: I1003 10:28:22.014915 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-w2g5j_9a285450-ca0c-4d27-b51b-81474aaa7d8f/registry-server/0.log" Oct 03 10:28:22 crc kubenswrapper[4790]: I1003 10:28:22.179183 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-kv6xz_94dc0b91-323f-4390-8af1-71019409bdf2/kube-rbac-proxy/0.log" Oct 03 10:28:22 crc kubenswrapper[4790]: I1003 10:28:22.364688 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-688db7b6c7-kv6xz_94dc0b91-323f-4390-8af1-71019409bdf2/manager/0.log" Oct 03 10:28:22 crc kubenswrapper[4790]: I1003 
10:28:22.422648 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-8gls6_c7787291-34a2-4682-88f4-7b05f28f36f2/kube-rbac-proxy/0.log" Oct 03 10:28:22 crc kubenswrapper[4790]: I1003 10:28:22.457777 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7d8bb7f44c-8gls6_c7787291-34a2-4682-88f4-7b05f28f36f2/manager/0.log" Oct 03 10:28:22 crc kubenswrapper[4790]: I1003 10:28:22.662177 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-ntlnz_7c58f90d-6708-4bb2-ba68-fb4e08b99c61/kube-rbac-proxy/0.log" Oct 03 10:28:22 crc kubenswrapper[4790]: I1003 10:28:22.689144 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-q8lx4_681ac688-7419-4d78-bf96-3a1d54c30d8b/operator/0.log" Oct 03 10:28:22 crc kubenswrapper[4790]: I1003 10:28:22.811497 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-ntlnz_7c58f90d-6708-4bb2-ba68-fb4e08b99c61/manager/0.log" Oct 03 10:28:22 crc kubenswrapper[4790]: I1003 10:28:22.879723 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-wrcp9_b90a3b76-58bf-4c8a-a361-c59a183f6828/kube-rbac-proxy/0.log" Oct 03 10:28:22 crc kubenswrapper[4790]: I1003 10:28:22.886505 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6fb94b767d-nhfws_c64c5a59-ac27-4428-8cdc-a1f8c96368bb/manager/0.log" Oct 03 10:28:23 crc kubenswrapper[4790]: I1003 10:28:23.295439 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mshnf_15dfe278-fcef-4c6f-8c06-cce51c7d5f7c/manager/0.log" Oct 03 
10:28:23 crc kubenswrapper[4790]: I1003 10:28:23.319675 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-mshnf_15dfe278-fcef-4c6f-8c06-cce51c7d5f7c/kube-rbac-proxy/0.log" Oct 03 10:28:23 crc kubenswrapper[4790]: I1003 10:28:23.343373 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5db5cf686f-wrcp9_b90a3b76-58bf-4c8a-a361-c59a183f6828/manager/0.log" Oct 03 10:28:23 crc kubenswrapper[4790]: I1003 10:28:23.500692 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5c899c45b6-gb6mg_a583ed8e-8a45-4dc9-80e0-bf559d110818/kube-rbac-proxy/0.log" Oct 03 10:28:23 crc kubenswrapper[4790]: I1003 10:28:23.596593 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5c899c45b6-gb6mg_a583ed8e-8a45-4dc9-80e0-bf559d110818/manager/0.log" Oct 03 10:28:34 crc kubenswrapper[4790]: I1003 10:28:34.336140 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:28:34 crc kubenswrapper[4790]: E1003 10:28:34.336895 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:28:39 crc kubenswrapper[4790]: I1003 10:28:39.970022 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dp7r9_89e2e475-d36b-4626-ab23-e767070eb376/control-plane-machine-set-operator/0.log" Oct 03 10:28:40 crc 
kubenswrapper[4790]: I1003 10:28:40.182799 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tml4h_2f81f816-0c58-437b-be43-b782515e8997/kube-rbac-proxy/0.log" Oct 03 10:28:40 crc kubenswrapper[4790]: I1003 10:28:40.214372 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-tml4h_2f81f816-0c58-437b-be43-b782515e8997/machine-api-operator/0.log" Oct 03 10:28:48 crc kubenswrapper[4790]: I1003 10:28:48.329217 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:28:48 crc kubenswrapper[4790]: E1003 10:28:48.330049 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:28:52 crc kubenswrapper[4790]: I1003 10:28:52.904196 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-fpr25_594cfed0-b92c-4763-998c-c7aceef4c2f5/cert-manager-controller/0.log" Oct 03 10:28:53 crc kubenswrapper[4790]: I1003 10:28:53.098969 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-r87sx_f6eb09d0-b3d6-4bc0-9125-5389bf8124dd/cert-manager-cainjector/0.log" Oct 03 10:28:53 crc kubenswrapper[4790]: I1003 10:28:53.148239 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9k2xl_9ea495a1-9693-44fd-a9d5-6f9a710456f5/cert-manager-webhook/0.log" Oct 03 10:29:01 crc kubenswrapper[4790]: I1003 10:29:01.328595 4790 scope.go:117] "RemoveContainer" 
containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:29:01 crc kubenswrapper[4790]: E1003 10:29:01.329515 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:29:05 crc kubenswrapper[4790]: I1003 10:29:05.133759 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-kt62h_1594d001-e795-49a2-bd6a-3a9ec48c2e4d/nmstate-console-plugin/0.log" Oct 03 10:29:05 crc kubenswrapper[4790]: I1003 10:29:05.330524 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-hrwtp_d542c730-e799-4975-9b94-ae826e4706cf/nmstate-metrics/0.log" Oct 03 10:29:05 crc kubenswrapper[4790]: I1003 10:29:05.332023 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-rdglh_525f7376-4012-46b8-b79b-0c7bfb288354/nmstate-handler/0.log" Oct 03 10:29:05 crc kubenswrapper[4790]: I1003 10:29:05.334564 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-hrwtp_d542c730-e799-4975-9b94-ae826e4706cf/kube-rbac-proxy/0.log" Oct 03 10:29:05 crc kubenswrapper[4790]: I1003 10:29:05.543773 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-l4kg8_8ad5530d-7fb8-4c5b-b0ef-df6837ebc877/nmstate-webhook/0.log" Oct 03 10:29:05 crc kubenswrapper[4790]: I1003 10:29:05.567972 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-kgcvn_5fb9da29-2f74-417d-8394-8f8fa9fea3ec/nmstate-operator/0.log" Oct 03 10:29:16 crc kubenswrapper[4790]: I1003 10:29:16.328899 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:29:16 crc kubenswrapper[4790]: E1003 10:29:16.329937 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.143387 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-lbn9j_32b53a38-4a5c-44b1-9518-fe99746bf6a4/kube-rbac-proxy/0.log" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.346534 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-lbn9j_32b53a38-4a5c-44b1-9518-fe99746bf6a4/controller/0.log" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.379371 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-frr-files/0.log" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.593851 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-frr-files/0.log" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.594725 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-reloader/0.log" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.646509 4790 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-reloader/0.log" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.656308 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-metrics/0.log" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.828207 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-reloader/0.log" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.834981 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-metrics/0.log" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.865736 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-frr-files/0.log" Oct 03 10:29:19 crc kubenswrapper[4790]: I1003 10:29:19.870822 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-metrics/0.log" Oct 03 10:29:20 crc kubenswrapper[4790]: I1003 10:29:20.072507 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-reloader/0.log" Oct 03 10:29:20 crc kubenswrapper[4790]: I1003 10:29:20.076009 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-metrics/0.log" Oct 03 10:29:20 crc kubenswrapper[4790]: I1003 10:29:20.087021 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/cp-frr-files/0.log" Oct 03 10:29:20 crc kubenswrapper[4790]: I1003 10:29:20.147810 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/controller/0.log" Oct 03 10:29:20 crc kubenswrapper[4790]: I1003 10:29:20.361654 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/frr-metrics/0.log" Oct 03 10:29:20 crc kubenswrapper[4790]: I1003 10:29:20.390731 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/kube-rbac-proxy/0.log" Oct 03 10:29:20 crc kubenswrapper[4790]: I1003 10:29:20.390922 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/kube-rbac-proxy-frr/0.log" Oct 03 10:29:20 crc kubenswrapper[4790]: I1003 10:29:20.582273 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/reloader/0.log" Oct 03 10:29:20 crc kubenswrapper[4790]: I1003 10:29:20.660956 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-xtg2d_966af9a3-d44e-4bf2-bff8-9b8573083daa/frr-k8s-webhook-server/0.log" Oct 03 10:29:20 crc kubenswrapper[4790]: I1003 10:29:20.829022 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-c97c79c55-pqbls_1ae3495c-a801-4526-87ed-b8e81ea71d53/manager/0.log" Oct 03 10:29:21 crc kubenswrapper[4790]: I1003 10:29:21.070890 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-687d9f4499-mr2f5_f5d65866-e446-4512-b8aa-566a283ff5be/webhook-server/0.log" Oct 03 10:29:21 crc kubenswrapper[4790]: I1003 10:29:21.144951 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lxwhp_8f597a65-e358-498d-b635-6eafe7618337/kube-rbac-proxy/0.log" Oct 03 10:29:21 crc kubenswrapper[4790]: I1003 10:29:21.812360 4790 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lxwhp_8f597a65-e358-498d-b635-6eafe7618337/speaker/0.log" Oct 03 10:29:21 crc kubenswrapper[4790]: I1003 10:29:21.969981 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-rzzll_904c6a5c-2937-4b23-bdbb-4dd0e461f9fd/frr/0.log" Oct 03 10:29:27 crc kubenswrapper[4790]: I1003 10:29:27.328928 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:29:27 crc kubenswrapper[4790]: E1003 10:29:27.329725 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.168756 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/util/0.log" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.305720 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/pull/0.log" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.317188 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/util/0.log" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.375653 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/pull/0.log" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.531617 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/util/0.log" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.571347 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/extract/0.log" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.581884 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2g6bf6_cdfb6807-f905-4cef-973b-10f556f2fc96/pull/0.log" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.717375 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/util/0.log" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.941894 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/util/0.log" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.946883 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/pull/0.log" Oct 03 10:29:34 crc kubenswrapper[4790]: I1003 10:29:34.992411 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/pull/0.log" Oct 03 
10:29:35 crc kubenswrapper[4790]: I1003 10:29:35.217531 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/pull/0.log" Oct 03 10:29:35 crc kubenswrapper[4790]: I1003 10:29:35.219510 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/util/0.log" Oct 03 10:29:35 crc kubenswrapper[4790]: I1003 10:29:35.226291 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkh77q_f8b2b263-b285-4112-96fe-22081359fece/extract/0.log" Oct 03 10:29:35 crc kubenswrapper[4790]: I1003 10:29:35.391115 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-utilities/0.log" Oct 03 10:29:35 crc kubenswrapper[4790]: I1003 10:29:35.572220 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-content/0.log" Oct 03 10:29:35 crc kubenswrapper[4790]: I1003 10:29:35.575637 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-utilities/0.log" Oct 03 10:29:35 crc kubenswrapper[4790]: I1003 10:29:35.599209 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-content/0.log" Oct 03 10:29:35 crc kubenswrapper[4790]: I1003 10:29:35.733358 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-utilities/0.log" Oct 03 10:29:35 crc 
kubenswrapper[4790]: I1003 10:29:35.742943 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/extract-content/0.log" Oct 03 10:29:35 crc kubenswrapper[4790]: I1003 10:29:35.953439 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-utilities/0.log" Oct 03 10:29:36 crc kubenswrapper[4790]: I1003 10:29:36.177696 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-utilities/0.log" Oct 03 10:29:36 crc kubenswrapper[4790]: I1003 10:29:36.248553 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-content/0.log" Oct 03 10:29:36 crc kubenswrapper[4790]: I1003 10:29:36.274339 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-content/0.log" Oct 03 10:29:36 crc kubenswrapper[4790]: I1003 10:29:36.457092 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-utilities/0.log" Oct 03 10:29:36 crc kubenswrapper[4790]: I1003 10:29:36.459928 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/extract-content/0.log" Oct 03 10:29:36 crc kubenswrapper[4790]: I1003 10:29:36.564276 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zkj66_bec1bd8b-1006-4ab4-b8bc-79b3ea63ab3c/registry-server/0.log" Oct 03 10:29:36 crc kubenswrapper[4790]: I1003 10:29:36.675938 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/util/0.log" Oct 03 10:29:36 crc kubenswrapper[4790]: I1003 10:29:36.899108 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/pull/0.log" Oct 03 10:29:36 crc kubenswrapper[4790]: I1003 10:29:36.919789 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/util/0.log" Oct 03 10:29:36 crc kubenswrapper[4790]: I1003 10:29:36.967860 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/pull/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.120390 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-92r86_4c817004-6a29-4be9-907b-cc5a254e7132/registry-server/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.183309 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/util/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.188665 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/pull/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.209192 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cblgh2_1db4e394-a858-4ab0-b051-3135c5516405/extract/0.log" Oct 03 10:29:37 crc 
kubenswrapper[4790]: I1003 10:29:37.352084 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-5sbms_dc166124-1b08-4c1e-90fe-a35d6331fcf4/marketplace-operator/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.427660 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-utilities/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.553623 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-utilities/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.584207 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-content/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.585911 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-content/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.789584 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-utilities/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.790392 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/extract-content/0.log" Oct 03 10:29:37 crc kubenswrapper[4790]: I1003 10:29:37.911079 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-utilities/0.log" Oct 03 10:29:38 crc kubenswrapper[4790]: I1003 10:29:38.073799 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-jshwg_cd3a5dd4-70b9-462e-994b-18d4fbf399c3/registry-server/0.log" Oct 03 10:29:38 crc kubenswrapper[4790]: I1003 10:29:38.096445 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-content/0.log" Oct 03 10:29:38 crc kubenswrapper[4790]: I1003 10:29:38.115131 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-content/0.log" Oct 03 10:29:38 crc kubenswrapper[4790]: I1003 10:29:38.115795 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-utilities/0.log" Oct 03 10:29:38 crc kubenswrapper[4790]: I1003 10:29:38.315766 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-utilities/0.log" Oct 03 10:29:38 crc kubenswrapper[4790]: I1003 10:29:38.333758 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/extract-content/0.log" Oct 03 10:29:39 crc kubenswrapper[4790]: I1003 10:29:39.228422 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5z6kh_61616e67-5189-42b0-a4bc-8f6fc803f2a7/registry-server/0.log" Oct 03 10:29:39 crc kubenswrapper[4790]: I1003 10:29:39.329517 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:29:39 crc kubenswrapper[4790]: E1003 10:29:39.329895 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:29:49 crc kubenswrapper[4790]: I1003 10:29:49.782294 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-wlf7r_66046d21-5295-4dd4-9f38-3859fc71918e/prometheus-operator/0.log" Oct 03 10:29:49 crc kubenswrapper[4790]: I1003 10:29:49.915501 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f956bfc86-24zvx_9a774d8d-e6d0-4a3c-ad82-767e7a213d2e/prometheus-operator-admission-webhook/0.log" Oct 03 10:29:49 crc kubenswrapper[4790]: I1003 10:29:49.988386 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-f956bfc86-q2djd_2f0ce448-4fba-4522-b3d3-8287c471d89b/prometheus-operator-admission-webhook/0.log" Oct 03 10:29:50 crc kubenswrapper[4790]: I1003 10:29:50.106992 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-rmlf2_feadd2c8-de45-4819-8626-812b7007c679/operator/0.log" Oct 03 10:29:50 crc kubenswrapper[4790]: I1003 10:29:50.200268 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-8sh7j_ed2d3ffa-e89d-45d0-b131-8567b5abda08/perses-operator/0.log" Oct 03 10:29:51 crc kubenswrapper[4790]: I1003 10:29:51.328521 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:29:51 crc kubenswrapper[4790]: E1003 10:29:51.329077 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.203029 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8"] Oct 03 10:30:00 crc kubenswrapper[4790]: E1003 10:30:00.204020 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="805f0a83-e530-4cb6-9f1b-9f6de2acb4de" containerName="container-00" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.204040 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="805f0a83-e530-4cb6-9f1b-9f6de2acb4de" containerName="container-00" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.204314 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="805f0a83-e530-4cb6-9f1b-9f6de2acb4de" containerName="container-00" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.205046 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.208388 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.211038 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.258028 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8"] Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.278069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsc7j\" (UniqueName: \"kubernetes.io/projected/40bdff43-ca94-428e-8d64-ddaeb98fbb85-kube-api-access-jsc7j\") pod \"collect-profiles-29324790-nz4j8\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.278126 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40bdff43-ca94-428e-8d64-ddaeb98fbb85-secret-volume\") pod \"collect-profiles-29324790-nz4j8\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.278226 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40bdff43-ca94-428e-8d64-ddaeb98fbb85-config-volume\") pod \"collect-profiles-29324790-nz4j8\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.380215 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsc7j\" (UniqueName: \"kubernetes.io/projected/40bdff43-ca94-428e-8d64-ddaeb98fbb85-kube-api-access-jsc7j\") pod \"collect-profiles-29324790-nz4j8\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.380298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40bdff43-ca94-428e-8d64-ddaeb98fbb85-secret-volume\") pod \"collect-profiles-29324790-nz4j8\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.380457 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40bdff43-ca94-428e-8d64-ddaeb98fbb85-config-volume\") pod \"collect-profiles-29324790-nz4j8\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.381691 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40bdff43-ca94-428e-8d64-ddaeb98fbb85-config-volume\") pod \"collect-profiles-29324790-nz4j8\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.388284 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/40bdff43-ca94-428e-8d64-ddaeb98fbb85-secret-volume\") pod \"collect-profiles-29324790-nz4j8\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.401701 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsc7j\" (UniqueName: \"kubernetes.io/projected/40bdff43-ca94-428e-8d64-ddaeb98fbb85-kube-api-access-jsc7j\") pod \"collect-profiles-29324790-nz4j8\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:00 crc kubenswrapper[4790]: I1003 10:30:00.540996 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:01 crc kubenswrapper[4790]: I1003 10:30:01.146785 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8"] Oct 03 10:30:01 crc kubenswrapper[4790]: I1003 10:30:01.243474 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" event={"ID":"40bdff43-ca94-428e-8d64-ddaeb98fbb85","Type":"ContainerStarted","Data":"3b87eb686c12303c954aa5d7006085d30ee002d1ad303faba30772d353989d91"} Oct 03 10:30:02 crc kubenswrapper[4790]: I1003 10:30:02.254097 4790 generic.go:334] "Generic (PLEG): container finished" podID="40bdff43-ca94-428e-8d64-ddaeb98fbb85" containerID="1382a9eb1b0c85507861e82657b61e0f8039ec4be82bbce220c25d1c4913487e" exitCode=0 Oct 03 10:30:02 crc kubenswrapper[4790]: I1003 10:30:02.254190 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" 
event={"ID":"40bdff43-ca94-428e-8d64-ddaeb98fbb85","Type":"ContainerDied","Data":"1382a9eb1b0c85507861e82657b61e0f8039ec4be82bbce220c25d1c4913487e"} Oct 03 10:30:03 crc kubenswrapper[4790]: I1003 10:30:03.736373 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:03 crc kubenswrapper[4790]: I1003 10:30:03.752265 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsc7j\" (UniqueName: \"kubernetes.io/projected/40bdff43-ca94-428e-8d64-ddaeb98fbb85-kube-api-access-jsc7j\") pod \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " Oct 03 10:30:03 crc kubenswrapper[4790]: I1003 10:30:03.752369 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40bdff43-ca94-428e-8d64-ddaeb98fbb85-config-volume\") pod \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " Oct 03 10:30:03 crc kubenswrapper[4790]: I1003 10:30:03.752428 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40bdff43-ca94-428e-8d64-ddaeb98fbb85-secret-volume\") pod \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\" (UID: \"40bdff43-ca94-428e-8d64-ddaeb98fbb85\") " Oct 03 10:30:03 crc kubenswrapper[4790]: I1003 10:30:03.753142 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40bdff43-ca94-428e-8d64-ddaeb98fbb85-config-volume" (OuterVolumeSpecName: "config-volume") pod "40bdff43-ca94-428e-8d64-ddaeb98fbb85" (UID: "40bdff43-ca94-428e-8d64-ddaeb98fbb85"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 03 10:30:03 crc kubenswrapper[4790]: I1003 10:30:03.773141 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40bdff43-ca94-428e-8d64-ddaeb98fbb85-kube-api-access-jsc7j" (OuterVolumeSpecName: "kube-api-access-jsc7j") pod "40bdff43-ca94-428e-8d64-ddaeb98fbb85" (UID: "40bdff43-ca94-428e-8d64-ddaeb98fbb85"). InnerVolumeSpecName "kube-api-access-jsc7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:30:03 crc kubenswrapper[4790]: I1003 10:30:03.777788 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40bdff43-ca94-428e-8d64-ddaeb98fbb85-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "40bdff43-ca94-428e-8d64-ddaeb98fbb85" (UID: "40bdff43-ca94-428e-8d64-ddaeb98fbb85"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 03 10:30:03 crc kubenswrapper[4790]: I1003 10:30:03.853814 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsc7j\" (UniqueName: \"kubernetes.io/projected/40bdff43-ca94-428e-8d64-ddaeb98fbb85-kube-api-access-jsc7j\") on node \"crc\" DevicePath \"\"" Oct 03 10:30:03 crc kubenswrapper[4790]: I1003 10:30:03.853852 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40bdff43-ca94-428e-8d64-ddaeb98fbb85-config-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:30:03 crc kubenswrapper[4790]: I1003 10:30:03.853864 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40bdff43-ca94-428e-8d64-ddaeb98fbb85-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 03 10:30:04 crc kubenswrapper[4790]: I1003 10:30:04.276566 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" 
event={"ID":"40bdff43-ca94-428e-8d64-ddaeb98fbb85","Type":"ContainerDied","Data":"3b87eb686c12303c954aa5d7006085d30ee002d1ad303faba30772d353989d91"} Oct 03 10:30:04 crc kubenswrapper[4790]: I1003 10:30:04.276630 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b87eb686c12303c954aa5d7006085d30ee002d1ad303faba30772d353989d91" Oct 03 10:30:04 crc kubenswrapper[4790]: I1003 10:30:04.276644 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29324790-nz4j8" Oct 03 10:30:04 crc kubenswrapper[4790]: I1003 10:30:04.832912 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5"] Oct 03 10:30:04 crc kubenswrapper[4790]: I1003 10:30:04.845685 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29324745-lphg5"] Oct 03 10:30:06 crc kubenswrapper[4790]: I1003 10:30:06.331807 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:30:06 crc kubenswrapper[4790]: E1003 10:30:06.332296 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:30:06 crc kubenswrapper[4790]: I1003 10:30:06.342085 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5604f3b1-96b7-41fd-9fbf-8fadb45a1d90" path="/var/lib/kubelet/pods/5604f3b1-96b7-41fd-9fbf-8fadb45a1d90/volumes" Oct 03 10:30:20 crc kubenswrapper[4790]: I1003 10:30:20.328632 4790 scope.go:117] "RemoveContainer" 
containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:30:20 crc kubenswrapper[4790]: E1003 10:30:20.329839 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:30:31 crc kubenswrapper[4790]: I1003 10:30:31.675802 4790 scope.go:117] "RemoveContainer" containerID="8cd0bd8f10beed80a3b65ce20a11b384b651d96ccff7467acae0e96ab9068e82" Oct 03 10:30:35 crc kubenswrapper[4790]: I1003 10:30:35.329030 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:30:35 crc kubenswrapper[4790]: E1003 10:30:35.330079 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:30:48 crc kubenswrapper[4790]: I1003 10:30:48.329050 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:30:48 crc kubenswrapper[4790]: E1003 10:30:48.330033 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:31:01 crc kubenswrapper[4790]: I1003 10:31:01.329564 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:31:01 crc kubenswrapper[4790]: E1003 10:31:01.330505 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-zqj4j_openshift-machine-config-operator(36eae06a-d8be-4128-8d68-acb32e1f011b)\"" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" Oct 03 10:31:15 crc kubenswrapper[4790]: I1003 10:31:15.329255 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34" Oct 03 10:31:16 crc kubenswrapper[4790]: I1003 10:31:16.048093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"9c577e63d1d9f135c187b5516a22ab28a6b9fe25292b7135a8d3b273532738bd"} Oct 03 10:31:52 crc kubenswrapper[4790]: I1003 10:31:52.747527 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="ab369c6c-af88-4f07-bc5c-fb229a9b2cc8" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.311063 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-72qb7"] Oct 03 10:32:21 crc kubenswrapper[4790]: E1003 10:32:21.315801 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40bdff43-ca94-428e-8d64-ddaeb98fbb85" containerName="collect-profiles" Oct 03 10:32:21 crc 
kubenswrapper[4790]: I1003 10:32:21.315834 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bdff43-ca94-428e-8d64-ddaeb98fbb85" containerName="collect-profiles" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.316054 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="40bdff43-ca94-428e-8d64-ddaeb98fbb85" containerName="collect-profiles" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.317560 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.331653 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-72qb7"] Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.432986 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-utilities\") pod \"redhat-operators-72qb7\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") " pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.433082 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8zhc\" (UniqueName: \"kubernetes.io/projected/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-kube-api-access-z8zhc\") pod \"redhat-operators-72qb7\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") " pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.433252 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-catalog-content\") pod \"redhat-operators-72qb7\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") " pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:21 crc 
kubenswrapper[4790]: I1003 10:32:21.535833 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-utilities\") pod \"redhat-operators-72qb7\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") " pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.535920 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8zhc\" (UniqueName: \"kubernetes.io/projected/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-kube-api-access-z8zhc\") pod \"redhat-operators-72qb7\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") " pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.536040 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-catalog-content\") pod \"redhat-operators-72qb7\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") " pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.536550 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-catalog-content\") pod \"redhat-operators-72qb7\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") " pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.536868 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-utilities\") pod \"redhat-operators-72qb7\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") " pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.561971 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8zhc\" (UniqueName: \"kubernetes.io/projected/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-kube-api-access-z8zhc\") pod \"redhat-operators-72qb7\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") " pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:21 crc kubenswrapper[4790]: I1003 10:32:21.645554 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:22 crc kubenswrapper[4790]: I1003 10:32:22.151904 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-72qb7"] Oct 03 10:32:22 crc kubenswrapper[4790]: I1003 10:32:22.805149 4790 generic.go:334] "Generic (PLEG): container finished" podID="8cc2c9eb-da3e-4b0a-890c-76d398baa1a3" containerID="f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6" exitCode=0 Oct 03 10:32:22 crc kubenswrapper[4790]: I1003 10:32:22.805414 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72qb7" event={"ID":"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3","Type":"ContainerDied","Data":"f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6"} Oct 03 10:32:22 crc kubenswrapper[4790]: I1003 10:32:22.805437 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72qb7" event={"ID":"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3","Type":"ContainerStarted","Data":"76251d7d767b3608533fd7960876480f1123aed6dc44db5c04fac6303d0c8810"} Oct 03 10:32:22 crc kubenswrapper[4790]: I1003 10:32:22.807313 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 03 10:32:23 crc kubenswrapper[4790]: I1003 10:32:23.818821 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72qb7" 
event={"ID":"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3","Type":"ContainerStarted","Data":"a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694"} Oct 03 10:32:23 crc kubenswrapper[4790]: I1003 10:32:23.837961 4790 generic.go:334] "Generic (PLEG): container finished" podID="6f3a7f20-a077-41e3-9971-85318acc2a26" containerID="7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79" exitCode=0 Oct 03 10:32:23 crc kubenswrapper[4790]: I1003 10:32:23.838021 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dxvsr/must-gather-p4m4p" event={"ID":"6f3a7f20-a077-41e3-9971-85318acc2a26","Type":"ContainerDied","Data":"7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79"} Oct 03 10:32:23 crc kubenswrapper[4790]: I1003 10:32:23.838878 4790 scope.go:117] "RemoveContainer" containerID="7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.068455 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dxvsr_must-gather-p4m4p_6f3a7f20-a077-41e3-9971-85318acc2a26/gather/0.log" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.492757 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mdhmz"] Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.495418 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.504967 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdhmz"] Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.602300 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-utilities\") pod \"certified-operators-mdhmz\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.602384 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-catalog-content\") pod \"certified-operators-mdhmz\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.602420 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsvvb\" (UniqueName: \"kubernetes.io/projected/017fbf7a-4f9b-4b06-981c-b7d788e47650-kube-api-access-bsvvb\") pod \"certified-operators-mdhmz\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.683183 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-46k9s"] Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.685982 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.691944 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-46k9s"] Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.704406 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-utilities\") pod \"certified-operators-mdhmz\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.704467 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-catalog-content\") pod \"certified-operators-mdhmz\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.704493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsvvb\" (UniqueName: \"kubernetes.io/projected/017fbf7a-4f9b-4b06-981c-b7d788e47650-kube-api-access-bsvvb\") pod \"certified-operators-mdhmz\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.705357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-utilities\") pod \"certified-operators-mdhmz\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.705647 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-catalog-content\") pod \"certified-operators-mdhmz\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.739909 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsvvb\" (UniqueName: \"kubernetes.io/projected/017fbf7a-4f9b-4b06-981c-b7d788e47650-kube-api-access-bsvvb\") pod \"certified-operators-mdhmz\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.807377 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-catalog-content\") pod \"community-operators-46k9s\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.807833 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v82hk\" (UniqueName: \"kubernetes.io/projected/1e45d3b6-6222-4ca7-bd1f-901f41999907-kube-api-access-v82hk\") pod \"community-operators-46k9s\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.808366 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-utilities\") pod \"community-operators-46k9s\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.822747 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.911422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-utilities\") pod \"community-operators-46k9s\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.911497 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-catalog-content\") pod \"community-operators-46k9s\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.911518 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v82hk\" (UniqueName: \"kubernetes.io/projected/1e45d3b6-6222-4ca7-bd1f-901f41999907-kube-api-access-v82hk\") pod \"community-operators-46k9s\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.912183 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-utilities\") pod \"community-operators-46k9s\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.912610 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-catalog-content\") pod \"community-operators-46k9s\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " 
pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:24 crc kubenswrapper[4790]: I1003 10:32:24.944414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v82hk\" (UniqueName: \"kubernetes.io/projected/1e45d3b6-6222-4ca7-bd1f-901f41999907-kube-api-access-v82hk\") pod \"community-operators-46k9s\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:25 crc kubenswrapper[4790]: I1003 10:32:25.007175 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:25 crc kubenswrapper[4790]: I1003 10:32:25.442815 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mdhmz"] Oct 03 10:32:25 crc kubenswrapper[4790]: I1003 10:32:25.458388 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-46k9s"] Oct 03 10:32:25 crc kubenswrapper[4790]: W1003 10:32:25.467647 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod017fbf7a_4f9b_4b06_981c_b7d788e47650.slice/crio-f5028b398f050fbf1400b080bfe9d254ed341499b1c91f62056dd1ecd1b59380 WatchSource:0}: Error finding container f5028b398f050fbf1400b080bfe9d254ed341499b1c91f62056dd1ecd1b59380: Status 404 returned error can't find the container with id f5028b398f050fbf1400b080bfe9d254ed341499b1c91f62056dd1ecd1b59380 Oct 03 10:32:25 crc kubenswrapper[4790]: W1003 10:32:25.480772 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e45d3b6_6222_4ca7_bd1f_901f41999907.slice/crio-71eab605338a67c708c18d87fe79ea3d79458136d122a64015be90bda2e3bb46 WatchSource:0}: Error finding container 71eab605338a67c708c18d87fe79ea3d79458136d122a64015be90bda2e3bb46: Status 404 returned error can't find the container 
with id 71eab605338a67c708c18d87fe79ea3d79458136d122a64015be90bda2e3bb46 Oct 03 10:32:25 crc kubenswrapper[4790]: I1003 10:32:25.876801 4790 generic.go:334] "Generic (PLEG): container finished" podID="017fbf7a-4f9b-4b06-981c-b7d788e47650" containerID="153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620" exitCode=0 Oct 03 10:32:25 crc kubenswrapper[4790]: I1003 10:32:25.877841 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdhmz" event={"ID":"017fbf7a-4f9b-4b06-981c-b7d788e47650","Type":"ContainerDied","Data":"153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620"} Oct 03 10:32:25 crc kubenswrapper[4790]: I1003 10:32:25.877971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdhmz" event={"ID":"017fbf7a-4f9b-4b06-981c-b7d788e47650","Type":"ContainerStarted","Data":"f5028b398f050fbf1400b080bfe9d254ed341499b1c91f62056dd1ecd1b59380"} Oct 03 10:32:25 crc kubenswrapper[4790]: I1003 10:32:25.884376 4790 generic.go:334] "Generic (PLEG): container finished" podID="1e45d3b6-6222-4ca7-bd1f-901f41999907" containerID="d3b2e0e7e22447f3ddbbb6776f6a35f45eaa08d4474c145e871cfb3cf3621555" exitCode=0 Oct 03 10:32:25 crc kubenswrapper[4790]: I1003 10:32:25.884415 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46k9s" event={"ID":"1e45d3b6-6222-4ca7-bd1f-901f41999907","Type":"ContainerDied","Data":"d3b2e0e7e22447f3ddbbb6776f6a35f45eaa08d4474c145e871cfb3cf3621555"} Oct 03 10:32:25 crc kubenswrapper[4790]: I1003 10:32:25.884444 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46k9s" event={"ID":"1e45d3b6-6222-4ca7-bd1f-901f41999907","Type":"ContainerStarted","Data":"71eab605338a67c708c18d87fe79ea3d79458136d122a64015be90bda2e3bb46"} Oct 03 10:32:26 crc kubenswrapper[4790]: I1003 10:32:26.896401 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-46k9s" event={"ID":"1e45d3b6-6222-4ca7-bd1f-901f41999907","Type":"ContainerStarted","Data":"6e32ff935da247821694cf8f31911cf2f2c7ab8686f79f8026312baf47409484"} Oct 03 10:32:27 crc kubenswrapper[4790]: I1003 10:32:27.910451 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdhmz" event={"ID":"017fbf7a-4f9b-4b06-981c-b7d788e47650","Type":"ContainerStarted","Data":"acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76"} Oct 03 10:32:28 crc kubenswrapper[4790]: I1003 10:32:28.926330 4790 generic.go:334] "Generic (PLEG): container finished" podID="8cc2c9eb-da3e-4b0a-890c-76d398baa1a3" containerID="a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694" exitCode=0 Oct 03 10:32:28 crc kubenswrapper[4790]: I1003 10:32:28.926433 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72qb7" event={"ID":"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3","Type":"ContainerDied","Data":"a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694"} Oct 03 10:32:29 crc kubenswrapper[4790]: I1003 10:32:29.954553 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72qb7" event={"ID":"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3","Type":"ContainerStarted","Data":"f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40"} Oct 03 10:32:29 crc kubenswrapper[4790]: I1003 10:32:29.956801 4790 generic.go:334] "Generic (PLEG): container finished" podID="017fbf7a-4f9b-4b06-981c-b7d788e47650" containerID="acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76" exitCode=0 Oct 03 10:32:29 crc kubenswrapper[4790]: I1003 10:32:29.956866 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdhmz" 
event={"ID":"017fbf7a-4f9b-4b06-981c-b7d788e47650","Type":"ContainerDied","Data":"acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76"} Oct 03 10:32:29 crc kubenswrapper[4790]: I1003 10:32:29.960282 4790 generic.go:334] "Generic (PLEG): container finished" podID="1e45d3b6-6222-4ca7-bd1f-901f41999907" containerID="6e32ff935da247821694cf8f31911cf2f2c7ab8686f79f8026312baf47409484" exitCode=0 Oct 03 10:32:29 crc kubenswrapper[4790]: I1003 10:32:29.960315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46k9s" event={"ID":"1e45d3b6-6222-4ca7-bd1f-901f41999907","Type":"ContainerDied","Data":"6e32ff935da247821694cf8f31911cf2f2c7ab8686f79f8026312baf47409484"} Oct 03 10:32:29 crc kubenswrapper[4790]: I1003 10:32:29.980948 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-72qb7" podStartSLOduration=2.235003028 podStartE2EDuration="8.980925985s" podCreationTimestamp="2025-10-03 10:32:21 +0000 UTC" firstStartedPulling="2025-10-03 10:32:22.807130717 +0000 UTC m=+6729.160064955" lastFinishedPulling="2025-10-03 10:32:29.553053674 +0000 UTC m=+6735.905987912" observedRunningTime="2025-10-03 10:32:29.975201817 +0000 UTC m=+6736.328136065" watchObservedRunningTime="2025-10-03 10:32:29.980925985 +0000 UTC m=+6736.333860223" Oct 03 10:32:30 crc kubenswrapper[4790]: I1003 10:32:30.977504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46k9s" event={"ID":"1e45d3b6-6222-4ca7-bd1f-901f41999907","Type":"ContainerStarted","Data":"6251648b8fde67799b16504046f122aeabcf3f7471a6257278c14b21f6ba8411"} Oct 03 10:32:31 crc kubenswrapper[4790]: I1003 10:32:31.003074 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-46k9s" podStartSLOduration=2.504980799 podStartE2EDuration="7.00304955s" podCreationTimestamp="2025-10-03 10:32:24 +0000 UTC" 
firstStartedPulling="2025-10-03 10:32:25.88671448 +0000 UTC m=+6732.239648718" lastFinishedPulling="2025-10-03 10:32:30.384783231 +0000 UTC m=+6736.737717469" observedRunningTime="2025-10-03 10:32:30.992742846 +0000 UTC m=+6737.345677134" watchObservedRunningTime="2025-10-03 10:32:31.00304955 +0000 UTC m=+6737.355983809" Oct 03 10:32:31 crc kubenswrapper[4790]: I1003 10:32:31.646113 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:31 crc kubenswrapper[4790]: I1003 10:32:31.646160 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-72qb7" Oct 03 10:32:31 crc kubenswrapper[4790]: I1003 10:32:31.992556 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdhmz" event={"ID":"017fbf7a-4f9b-4b06-981c-b7d788e47650","Type":"ContainerStarted","Data":"f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3"} Oct 03 10:32:32 crc kubenswrapper[4790]: I1003 10:32:32.010364 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mdhmz" podStartSLOduration=3.015836853 podStartE2EDuration="8.010346258s" podCreationTimestamp="2025-10-03 10:32:24 +0000 UTC" firstStartedPulling="2025-10-03 10:32:25.880615162 +0000 UTC m=+6732.233549400" lastFinishedPulling="2025-10-03 10:32:30.875124567 +0000 UTC m=+6737.228058805" observedRunningTime="2025-10-03 10:32:32.009104674 +0000 UTC m=+6738.362038912" watchObservedRunningTime="2025-10-03 10:32:32.010346258 +0000 UTC m=+6738.363280496" Oct 03 10:32:32 crc kubenswrapper[4790]: I1003 10:32:32.705047 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-72qb7" podUID="8cc2c9eb-da3e-4b0a-890c-76d398baa1a3" containerName="registry-server" probeResult="failure" output=< Oct 03 10:32:32 crc kubenswrapper[4790]: timeout: failed to 
connect service ":50051" within 1s Oct 03 10:32:32 crc kubenswrapper[4790]: > Oct 03 10:32:34 crc kubenswrapper[4790]: I1003 10:32:34.822927 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:34 crc kubenswrapper[4790]: I1003 10:32:34.823341 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:35 crc kubenswrapper[4790]: I1003 10:32:35.008124 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:35 crc kubenswrapper[4790]: I1003 10:32:35.008631 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:35 crc kubenswrapper[4790]: I1003 10:32:35.868875 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mdhmz" podUID="017fbf7a-4f9b-4b06-981c-b7d788e47650" containerName="registry-server" probeResult="failure" output=< Oct 03 10:32:35 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 10:32:35 crc kubenswrapper[4790]: > Oct 03 10:32:36 crc kubenswrapper[4790]: I1003 10:32:36.063262 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-46k9s" podUID="1e45d3b6-6222-4ca7-bd1f-901f41999907" containerName="registry-server" probeResult="failure" output=< Oct 03 10:32:36 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 10:32:36 crc kubenswrapper[4790]: > Oct 03 10:32:41 crc kubenswrapper[4790]: I1003 10:32:41.453451 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dxvsr/must-gather-p4m4p"] Oct 03 10:32:41 crc kubenswrapper[4790]: I1003 10:32:41.454540 4790 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-must-gather-dxvsr/must-gather-p4m4p" podUID="6f3a7f20-a077-41e3-9971-85318acc2a26" containerName="copy" containerID="cri-o://4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02" gracePeriod=2 Oct 03 10:32:41 crc kubenswrapper[4790]: I1003 10:32:41.469597 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dxvsr/must-gather-p4m4p"] Oct 03 10:32:41 crc kubenswrapper[4790]: I1003 10:32:41.983333 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dxvsr_must-gather-p4m4p_6f3a7f20-a077-41e3-9971-85318acc2a26/copy/0.log" Oct 03 10:32:41 crc kubenswrapper[4790]: I1003 10:32:41.984592 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dxvsr/must-gather-p4m4p" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.091306 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dxvsr_must-gather-p4m4p_6f3a7f20-a077-41e3-9971-85318acc2a26/copy/0.log" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.091888 4790 generic.go:334] "Generic (PLEG): container finished" podID="6f3a7f20-a077-41e3-9971-85318acc2a26" containerID="4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02" exitCode=143 Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.091955 4790 scope.go:117] "RemoveContainer" containerID="4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.091959 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dxvsr/must-gather-p4m4p" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.114223 4790 scope.go:117] "RemoveContainer" containerID="7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.127078 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f3a7f20-a077-41e3-9971-85318acc2a26-must-gather-output\") pod \"6f3a7f20-a077-41e3-9971-85318acc2a26\" (UID: \"6f3a7f20-a077-41e3-9971-85318acc2a26\") " Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.127178 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p286j\" (UniqueName: \"kubernetes.io/projected/6f3a7f20-a077-41e3-9971-85318acc2a26-kube-api-access-p286j\") pod \"6f3a7f20-a077-41e3-9971-85318acc2a26\" (UID: \"6f3a7f20-a077-41e3-9971-85318acc2a26\") " Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.139022 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f3a7f20-a077-41e3-9971-85318acc2a26-kube-api-access-p286j" (OuterVolumeSpecName: "kube-api-access-p286j") pod "6f3a7f20-a077-41e3-9971-85318acc2a26" (UID: "6f3a7f20-a077-41e3-9971-85318acc2a26"). InnerVolumeSpecName "kube-api-access-p286j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.172603 4790 scope.go:117] "RemoveContainer" containerID="4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02" Oct 03 10:32:42 crc kubenswrapper[4790]: E1003 10:32:42.174485 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02\": container with ID starting with 4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02 not found: ID does not exist" containerID="4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.174530 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02"} err="failed to get container status \"4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02\": rpc error: code = NotFound desc = could not find container \"4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02\": container with ID starting with 4a4cfbbeb2d2146d13da7e012a8137daaf16fab76772e5a715bbcaad9a26fb02 not found: ID does not exist" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.174557 4790 scope.go:117] "RemoveContainer" containerID="7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79" Oct 03 10:32:42 crc kubenswrapper[4790]: E1003 10:32:42.174831 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79\": container with ID starting with 7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79 not found: ID does not exist" containerID="7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.174854 
4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79"} err="failed to get container status \"7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79\": rpc error: code = NotFound desc = could not find container \"7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79\": container with ID starting with 7220800b3b3304d9e90ada19f98f7c27d04b2cf8a4f719d714fed9ac36c36f79 not found: ID does not exist" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.232296 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p286j\" (UniqueName: \"kubernetes.io/projected/6f3a7f20-a077-41e3-9971-85318acc2a26-kube-api-access-p286j\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.336769 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f3a7f20-a077-41e3-9971-85318acc2a26-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6f3a7f20-a077-41e3-9971-85318acc2a26" (UID: "6f3a7f20-a077-41e3-9971-85318acc2a26"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.349328 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f3a7f20-a077-41e3-9971-85318acc2a26" path="/var/lib/kubelet/pods/6f3a7f20-a077-41e3-9971-85318acc2a26/volumes" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.436405 4790 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6f3a7f20-a077-41e3-9971-85318acc2a26-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:42 crc kubenswrapper[4790]: I1003 10:32:42.698910 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-72qb7" podUID="8cc2c9eb-da3e-4b0a-890c-76d398baa1a3" containerName="registry-server" probeResult="failure" output=< Oct 03 10:32:42 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Oct 03 10:32:42 crc kubenswrapper[4790]: > Oct 03 10:32:44 crc kubenswrapper[4790]: I1003 10:32:44.889442 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:44 crc kubenswrapper[4790]: I1003 10:32:44.970786 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:45 crc kubenswrapper[4790]: I1003 10:32:45.068151 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:45 crc kubenswrapper[4790]: I1003 10:32:45.129130 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:45 crc kubenswrapper[4790]: I1003 10:32:45.145386 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mdhmz"] Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.139042 
4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mdhmz" podUID="017fbf7a-4f9b-4b06-981c-b7d788e47650" containerName="registry-server" containerID="cri-o://f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3" gracePeriod=2 Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.652015 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.724285 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-utilities\") pod \"017fbf7a-4f9b-4b06-981c-b7d788e47650\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.724397 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-catalog-content\") pod \"017fbf7a-4f9b-4b06-981c-b7d788e47650\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.724550 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsvvb\" (UniqueName: \"kubernetes.io/projected/017fbf7a-4f9b-4b06-981c-b7d788e47650-kube-api-access-bsvvb\") pod \"017fbf7a-4f9b-4b06-981c-b7d788e47650\" (UID: \"017fbf7a-4f9b-4b06-981c-b7d788e47650\") " Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.725731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-utilities" (OuterVolumeSpecName: "utilities") pod "017fbf7a-4f9b-4b06-981c-b7d788e47650" (UID: "017fbf7a-4f9b-4b06-981c-b7d788e47650"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.735542 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/017fbf7a-4f9b-4b06-981c-b7d788e47650-kube-api-access-bsvvb" (OuterVolumeSpecName: "kube-api-access-bsvvb") pod "017fbf7a-4f9b-4b06-981c-b7d788e47650" (UID: "017fbf7a-4f9b-4b06-981c-b7d788e47650"). InnerVolumeSpecName "kube-api-access-bsvvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.776502 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "017fbf7a-4f9b-4b06-981c-b7d788e47650" (UID: "017fbf7a-4f9b-4b06-981c-b7d788e47650"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.827402 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.827441 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsvvb\" (UniqueName: \"kubernetes.io/projected/017fbf7a-4f9b-4b06-981c-b7d788e47650-kube-api-access-bsvvb\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:46 crc kubenswrapper[4790]: I1003 10:32:46.827451 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/017fbf7a-4f9b-4b06-981c-b7d788e47650-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.161745 4790 generic.go:334] "Generic (PLEG): container finished" podID="017fbf7a-4f9b-4b06-981c-b7d788e47650" 
containerID="f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3" exitCode=0 Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.161922 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdhmz" event={"ID":"017fbf7a-4f9b-4b06-981c-b7d788e47650","Type":"ContainerDied","Data":"f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3"} Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.162030 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mdhmz" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.162061 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mdhmz" event={"ID":"017fbf7a-4f9b-4b06-981c-b7d788e47650","Type":"ContainerDied","Data":"f5028b398f050fbf1400b080bfe9d254ed341499b1c91f62056dd1ecd1b59380"} Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.162098 4790 scope.go:117] "RemoveContainer" containerID="f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.188716 4790 scope.go:117] "RemoveContainer" containerID="acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.200738 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mdhmz"] Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.209289 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mdhmz"] Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.223480 4790 scope.go:117] "RemoveContainer" containerID="153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.268678 4790 scope.go:117] "RemoveContainer" containerID="f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3" Oct 03 
10:32:47 crc kubenswrapper[4790]: E1003 10:32:47.269126 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3\": container with ID starting with f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3 not found: ID does not exist" containerID="f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.269176 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3"} err="failed to get container status \"f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3\": rpc error: code = NotFound desc = could not find container \"f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3\": container with ID starting with f8e00b73451013ad09a94f8c853cffc59dd9b44144e16e8e929a307f2f70cce3 not found: ID does not exist" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.269209 4790 scope.go:117] "RemoveContainer" containerID="acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76" Oct 03 10:32:47 crc kubenswrapper[4790]: E1003 10:32:47.269565 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76\": container with ID starting with acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76 not found: ID does not exist" containerID="acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.269614 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76"} err="failed to get container status 
\"acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76\": rpc error: code = NotFound desc = could not find container \"acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76\": container with ID starting with acd14a2471bdec593a7b8b214fc98be930d9869ff077c45be8b7e7deaa0e7b76 not found: ID does not exist" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.269632 4790 scope.go:117] "RemoveContainer" containerID="153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620" Oct 03 10:32:47 crc kubenswrapper[4790]: E1003 10:32:47.269912 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620\": container with ID starting with 153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620 not found: ID does not exist" containerID="153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.269984 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620"} err="failed to get container status \"153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620\": rpc error: code = NotFound desc = could not find container \"153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620\": container with ID starting with 153f1782944aa30771b2dec478f4d72c983acbc04be7e556436cb27f7f739620 not found: ID does not exist" Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.327691 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-46k9s"] Oct 03 10:32:47 crc kubenswrapper[4790]: I1003 10:32:47.328095 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-46k9s" podUID="1e45d3b6-6222-4ca7-bd1f-901f41999907" 
containerName="registry-server" containerID="cri-o://6251648b8fde67799b16504046f122aeabcf3f7471a6257278c14b21f6ba8411" gracePeriod=2 Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.172503 4790 generic.go:334] "Generic (PLEG): container finished" podID="1e45d3b6-6222-4ca7-bd1f-901f41999907" containerID="6251648b8fde67799b16504046f122aeabcf3f7471a6257278c14b21f6ba8411" exitCode=0 Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.172564 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46k9s" event={"ID":"1e45d3b6-6222-4ca7-bd1f-901f41999907","Type":"ContainerDied","Data":"6251648b8fde67799b16504046f122aeabcf3f7471a6257278c14b21f6ba8411"} Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.355418 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="017fbf7a-4f9b-4b06-981c-b7d788e47650" path="/var/lib/kubelet/pods/017fbf7a-4f9b-4b06-981c-b7d788e47650/volumes" Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.519239 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-46k9s" Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.720097 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v82hk\" (UniqueName: \"kubernetes.io/projected/1e45d3b6-6222-4ca7-bd1f-901f41999907-kube-api-access-v82hk\") pod \"1e45d3b6-6222-4ca7-bd1f-901f41999907\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.720619 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-utilities\") pod \"1e45d3b6-6222-4ca7-bd1f-901f41999907\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.720650 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-catalog-content\") pod \"1e45d3b6-6222-4ca7-bd1f-901f41999907\" (UID: \"1e45d3b6-6222-4ca7-bd1f-901f41999907\") " Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.721122 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-utilities" (OuterVolumeSpecName: "utilities") pod "1e45d3b6-6222-4ca7-bd1f-901f41999907" (UID: "1e45d3b6-6222-4ca7-bd1f-901f41999907"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.722194 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-utilities\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.735864 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e45d3b6-6222-4ca7-bd1f-901f41999907-kube-api-access-v82hk" (OuterVolumeSpecName: "kube-api-access-v82hk") pod "1e45d3b6-6222-4ca7-bd1f-901f41999907" (UID: "1e45d3b6-6222-4ca7-bd1f-901f41999907"). InnerVolumeSpecName "kube-api-access-v82hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.772878 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e45d3b6-6222-4ca7-bd1f-901f41999907" (UID: "1e45d3b6-6222-4ca7-bd1f-901f41999907"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.823297 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v82hk\" (UniqueName: \"kubernetes.io/projected/1e45d3b6-6222-4ca7-bd1f-901f41999907-kube-api-access-v82hk\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:48 crc kubenswrapper[4790]: I1003 10:32:48.823332 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e45d3b6-6222-4ca7-bd1f-901f41999907-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 03 10:32:49 crc kubenswrapper[4790]: I1003 10:32:49.183626 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-46k9s" event={"ID":"1e45d3b6-6222-4ca7-bd1f-901f41999907","Type":"ContainerDied","Data":"71eab605338a67c708c18d87fe79ea3d79458136d122a64015be90bda2e3bb46"} Oct 03 10:32:49 crc kubenswrapper[4790]: I1003 10:32:49.183683 4790 scope.go:117] "RemoveContainer" containerID="6251648b8fde67799b16504046f122aeabcf3f7471a6257278c14b21f6ba8411" Oct 03 10:32:49 crc kubenswrapper[4790]: I1003 10:32:49.183719 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-46k9s"
Oct 03 10:32:49 crc kubenswrapper[4790]: I1003 10:32:49.205198 4790 scope.go:117] "RemoveContainer" containerID="6e32ff935da247821694cf8f31911cf2f2c7ab8686f79f8026312baf47409484"
Oct 03 10:32:49 crc kubenswrapper[4790]: I1003 10:32:49.226383 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-46k9s"]
Oct 03 10:32:49 crc kubenswrapper[4790]: I1003 10:32:49.235977 4790 scope.go:117] "RemoveContainer" containerID="d3b2e0e7e22447f3ddbbb6776f6a35f45eaa08d4474c145e871cfb3cf3621555"
Oct 03 10:32:49 crc kubenswrapper[4790]: I1003 10:32:49.240812 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-46k9s"]
Oct 03 10:32:50 crc kubenswrapper[4790]: I1003 10:32:50.340246 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e45d3b6-6222-4ca7-bd1f-901f41999907" path="/var/lib/kubelet/pods/1e45d3b6-6222-4ca7-bd1f-901f41999907/volumes"
Oct 03 10:32:51 crc kubenswrapper[4790]: I1003 10:32:51.691394 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-72qb7"
Oct 03 10:32:51 crc kubenswrapper[4790]: I1003 10:32:51.741759 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-72qb7"
Oct 03 10:32:52 crc kubenswrapper[4790]: I1003 10:32:52.534548 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-72qb7"]
Oct 03 10:32:53 crc kubenswrapper[4790]: I1003 10:32:53.224983 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-72qb7" podUID="8cc2c9eb-da3e-4b0a-890c-76d398baa1a3" containerName="registry-server" containerID="cri-o://f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40" gracePeriod=2
Oct 03 10:32:53 crc kubenswrapper[4790]: I1003 10:32:53.772699 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-72qb7"
Oct 03 10:32:53 crc kubenswrapper[4790]: I1003 10:32:53.927006 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-catalog-content\") pod \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") "
Oct 03 10:32:53 crc kubenswrapper[4790]: I1003 10:32:53.927185 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-utilities\") pod \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") "
Oct 03 10:32:53 crc kubenswrapper[4790]: I1003 10:32:53.927245 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8zhc\" (UniqueName: \"kubernetes.io/projected/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-kube-api-access-z8zhc\") pod \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\" (UID: \"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3\") "
Oct 03 10:32:53 crc kubenswrapper[4790]: I1003 10:32:53.928503 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-utilities" (OuterVolumeSpecName: "utilities") pod "8cc2c9eb-da3e-4b0a-890c-76d398baa1a3" (UID: "8cc2c9eb-da3e-4b0a-890c-76d398baa1a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:32:53 crc kubenswrapper[4790]: I1003 10:32:53.934290 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-kube-api-access-z8zhc" (OuterVolumeSpecName: "kube-api-access-z8zhc") pod "8cc2c9eb-da3e-4b0a-890c-76d398baa1a3" (UID: "8cc2c9eb-da3e-4b0a-890c-76d398baa1a3"). InnerVolumeSpecName "kube-api-access-z8zhc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.029414 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-utilities\") on node \"crc\" DevicePath \"\""
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.029423 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cc2c9eb-da3e-4b0a-890c-76d398baa1a3" (UID: "8cc2c9eb-da3e-4b0a-890c-76d398baa1a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.029454 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8zhc\" (UniqueName: \"kubernetes.io/projected/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-kube-api-access-z8zhc\") on node \"crc\" DevicePath \"\""
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.131251 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.236671 4790 generic.go:334] "Generic (PLEG): container finished" podID="8cc2c9eb-da3e-4b0a-890c-76d398baa1a3" containerID="f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40" exitCode=0
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.237807 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-72qb7"
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.240720 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72qb7" event={"ID":"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3","Type":"ContainerDied","Data":"f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40"}
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.240787 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72qb7" event={"ID":"8cc2c9eb-da3e-4b0a-890c-76d398baa1a3","Type":"ContainerDied","Data":"76251d7d767b3608533fd7960876480f1123aed6dc44db5c04fac6303d0c8810"}
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.240807 4790 scope.go:117] "RemoveContainer" containerID="f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40"
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.271855 4790 scope.go:117] "RemoveContainer" containerID="a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694"
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.282789 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-72qb7"]
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.295567 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-72qb7"]
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.312614 4790 scope.go:117] "RemoveContainer" containerID="f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6"
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.345781 4790 scope.go:117] "RemoveContainer" containerID="f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40"
Oct 03 10:32:54 crc kubenswrapper[4790]: E1003 10:32:54.349523 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40\": container with ID starting with f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40 not found: ID does not exist" containerID="f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40"
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.349649 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40"} err="failed to get container status \"f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40\": rpc error: code = NotFound desc = could not find container \"f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40\": container with ID starting with f9a5fe8466cae31ba71e28b4ff368176a00fb40aa728befebc9cbd3c70072d40 not found: ID does not exist"
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.349683 4790 scope.go:117] "RemoveContainer" containerID="a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694"
Oct 03 10:32:54 crc kubenswrapper[4790]: E1003 10:32:54.352059 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694\": container with ID starting with a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694 not found: ID does not exist" containerID="a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694"
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.352242 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694"} err="failed to get container status \"a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694\": rpc error: code = NotFound desc = could not find container \"a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694\": container with ID starting with a65608959c141edcf182480f24216203dc9b883c97a3920e9dfb5a46f8809694 not found: ID does not exist"
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.352395 4790 scope.go:117] "RemoveContainer" containerID="f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6"
Oct 03 10:32:54 crc kubenswrapper[4790]: E1003 10:32:54.353530 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6\": container with ID starting with f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6 not found: ID does not exist" containerID="f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6"
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.353563 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6"} err="failed to get container status \"f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6\": rpc error: code = NotFound desc = could not find container \"f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6\": container with ID starting with f4a329a4c3f2705a114bdd2769c4732d749058fe893120df091b9cea2730b4e6 not found: ID does not exist"
Oct 03 10:32:54 crc kubenswrapper[4790]: I1003 10:32:54.358644 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc2c9eb-da3e-4b0a-890c-76d398baa1a3" path="/var/lib/kubelet/pods/8cc2c9eb-da3e-4b0a-890c-76d398baa1a3/volumes"
Oct 03 10:33:40 crc kubenswrapper[4790]: I1003 10:33:40.185973 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 10:33:40 crc kubenswrapper[4790]: I1003 10:33:40.186833 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 10:34:10 crc kubenswrapper[4790]: I1003 10:34:10.186126 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 10:34:10 crc kubenswrapper[4790]: I1003 10:34:10.186972 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 10:34:31 crc kubenswrapper[4790]: I1003 10:34:31.842109 4790 scope.go:117] "RemoveContainer" containerID="759a5431cb32737bdfdfbae4ef85c79871fed0bd2d2d6e4f2672390b4a13ac59"
Oct 03 10:34:40 crc kubenswrapper[4790]: I1003 10:34:40.187084 4790 patch_prober.go:28] interesting pod/machine-config-daemon-zqj4j container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 03 10:34:40 crc kubenswrapper[4790]: I1003 10:34:40.187667 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 03 10:34:40 crc kubenswrapper[4790]: I1003 10:34:40.187710 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j"
Oct 03 10:34:40 crc kubenswrapper[4790]: I1003 10:34:40.188475 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c577e63d1d9f135c187b5516a22ab28a6b9fe25292b7135a8d3b273532738bd"} pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 03 10:34:40 crc kubenswrapper[4790]: I1003 10:34:40.188534 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" podUID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerName="machine-config-daemon" containerID="cri-o://9c577e63d1d9f135c187b5516a22ab28a6b9fe25292b7135a8d3b273532738bd" gracePeriod=600
Oct 03 10:34:40 crc kubenswrapper[4790]: I1003 10:34:40.372249 4790 generic.go:334] "Generic (PLEG): container finished" podID="36eae06a-d8be-4128-8d68-acb32e1f011b" containerID="9c577e63d1d9f135c187b5516a22ab28a6b9fe25292b7135a8d3b273532738bd" exitCode=0
Oct 03 10:34:40 crc kubenswrapper[4790]: I1003 10:34:40.372333 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerDied","Data":"9c577e63d1d9f135c187b5516a22ab28a6b9fe25292b7135a8d3b273532738bd"}
Oct 03 10:34:40 crc kubenswrapper[4790]: I1003 10:34:40.372478 4790 scope.go:117] "RemoveContainer" containerID="4a3f2daf4e4f2fcbea44841698967b97018c1bcc4bd0dc9f6b5b49027a55eb34"
Oct 03 10:34:41 crc kubenswrapper[4790]: I1003 10:34:41.383776 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-zqj4j" event={"ID":"36eae06a-d8be-4128-8d68-acb32e1f011b","Type":"ContainerStarted","Data":"7fd83dbd718a3cdb13e56db15e4caab7780bfc7f2103b369b20f1b7e5f0566c6"}